Jan 23 04:00:49 np0005593295 kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 23 04:00:49 np0005593295 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 23 04:00:49 np0005593295 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 23 04:00:49 np0005593295 kernel: BIOS-provided physical RAM map:
Jan 23 04:00:49 np0005593295 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 23 04:00:49 np0005593295 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 23 04:00:49 np0005593295 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 23 04:00:49 np0005593295 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 23 04:00:49 np0005593295 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 23 04:00:49 np0005593295 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 23 04:00:49 np0005593295 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 23 04:00:49 np0005593295 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 23 04:00:49 np0005593295 kernel: NX (Execute Disable) protection: active
Jan 23 04:00:49 np0005593295 kernel: APIC: Static calls initialized
Jan 23 04:00:49 np0005593295 kernel: SMBIOS 2.8 present.
Jan 23 04:00:49 np0005593295 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 23 04:00:49 np0005593295 kernel: Hypervisor detected: KVM
Jan 23 04:00:49 np0005593295 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 23 04:00:49 np0005593295 kernel: kvm-clock: using sched offset of 3156574041 cycles
Jan 23 04:00:49 np0005593295 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 23 04:00:49 np0005593295 kernel: tsc: Detected 2799.998 MHz processor
Jan 23 04:00:49 np0005593295 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 23 04:00:49 np0005593295 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 23 04:00:49 np0005593295 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 23 04:00:49 np0005593295 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 23 04:00:49 np0005593295 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 23 04:00:49 np0005593295 kernel: Using GB pages for direct mapping
Jan 23 04:00:49 np0005593295 kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 23 04:00:49 np0005593295 kernel: ACPI: Early table checksum verification disabled
Jan 23 04:00:49 np0005593295 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 23 04:00:49 np0005593295 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 04:00:49 np0005593295 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 04:00:49 np0005593295 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 04:00:49 np0005593295 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 23 04:00:49 np0005593295 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 04:00:49 np0005593295 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 04:00:49 np0005593295 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 23 04:00:49 np0005593295 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 23 04:00:49 np0005593295 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 23 04:00:49 np0005593295 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 23 04:00:49 np0005593295 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 23 04:00:49 np0005593295 kernel: No NUMA configuration found
Jan 23 04:00:49 np0005593295 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 23 04:00:49 np0005593295 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Jan 23 04:00:49 np0005593295 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 23 04:00:49 np0005593295 kernel: Zone ranges:
Jan 23 04:00:49 np0005593295 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 23 04:00:49 np0005593295 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 23 04:00:49 np0005593295 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 23 04:00:49 np0005593295 kernel:  Device   empty
Jan 23 04:00:49 np0005593295 kernel: Movable zone start for each node
Jan 23 04:00:49 np0005593295 kernel: Early memory node ranges
Jan 23 04:00:49 np0005593295 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 23 04:00:49 np0005593295 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 23 04:00:49 np0005593295 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 23 04:00:49 np0005593295 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 23 04:00:49 np0005593295 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 23 04:00:49 np0005593295 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 23 04:00:49 np0005593295 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 23 04:00:49 np0005593295 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 23 04:00:49 np0005593295 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 23 04:00:49 np0005593295 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 23 04:00:49 np0005593295 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 23 04:00:49 np0005593295 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 23 04:00:49 np0005593295 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 23 04:00:49 np0005593295 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 23 04:00:49 np0005593295 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 23 04:00:49 np0005593295 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 23 04:00:49 np0005593295 kernel: TSC deadline timer available
Jan 23 04:00:49 np0005593295 kernel: CPU topo: Max. logical packages:   8
Jan 23 04:00:49 np0005593295 kernel: CPU topo: Max. logical dies:       8
Jan 23 04:00:49 np0005593295 kernel: CPU topo: Max. dies per package:   1
Jan 23 04:00:49 np0005593295 kernel: CPU topo: Max. threads per core:   1
Jan 23 04:00:49 np0005593295 kernel: CPU topo: Num. cores per package:     1
Jan 23 04:00:49 np0005593295 kernel: CPU topo: Num. threads per package:   1
Jan 23 04:00:49 np0005593295 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 23 04:00:49 np0005593295 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 23 04:00:49 np0005593295 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 23 04:00:49 np0005593295 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 23 04:00:49 np0005593295 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 23 04:00:49 np0005593295 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 23 04:00:49 np0005593295 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 23 04:00:49 np0005593295 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 23 04:00:49 np0005593295 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 23 04:00:49 np0005593295 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 23 04:00:49 np0005593295 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 23 04:00:49 np0005593295 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 23 04:00:49 np0005593295 kernel: Booting paravirtualized kernel on KVM
Jan 23 04:00:49 np0005593295 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 23 04:00:49 np0005593295 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 23 04:00:49 np0005593295 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 23 04:00:49 np0005593295 kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 23 04:00:49 np0005593295 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 23 04:00:49 np0005593295 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 23 04:00:49 np0005593295 kernel: random: crng init done
Jan 23 04:00:49 np0005593295 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 23 04:00:49 np0005593295 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 23 04:00:49 np0005593295 kernel: Fallback order for Node 0: 0 
Jan 23 04:00:49 np0005593295 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 23 04:00:49 np0005593295 kernel: Policy zone: Normal
Jan 23 04:00:49 np0005593295 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 23 04:00:49 np0005593295 kernel: software IO TLB: area num 8.
Jan 23 04:00:49 np0005593295 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 23 04:00:49 np0005593295 kernel: ftrace: allocating 49417 entries in 194 pages
Jan 23 04:00:49 np0005593295 kernel: ftrace: allocated 194 pages with 3 groups
Jan 23 04:00:49 np0005593295 kernel: Dynamic Preempt: voluntary
Jan 23 04:00:49 np0005593295 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 23 04:00:49 np0005593295 kernel: rcu: 	RCU event tracing is enabled.
Jan 23 04:00:49 np0005593295 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 23 04:00:49 np0005593295 kernel: 	Trampoline variant of Tasks RCU enabled.
Jan 23 04:00:49 np0005593295 kernel: 	Rude variant of Tasks RCU enabled.
Jan 23 04:00:49 np0005593295 kernel: 	Tracing variant of Tasks RCU enabled.
Jan 23 04:00:49 np0005593295 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 23 04:00:49 np0005593295 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 23 04:00:49 np0005593295 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 23 04:00:49 np0005593295 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 23 04:00:49 np0005593295 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 23 04:00:49 np0005593295 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 23 04:00:49 np0005593295 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 23 04:00:49 np0005593295 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 23 04:00:49 np0005593295 kernel: Console: colour VGA+ 80x25
Jan 23 04:00:49 np0005593295 kernel: printk: console [ttyS0] enabled
Jan 23 04:00:49 np0005593295 kernel: ACPI: Core revision 20230331
Jan 23 04:00:49 np0005593295 kernel: APIC: Switch to symmetric I/O mode setup
Jan 23 04:00:49 np0005593295 kernel: x2apic enabled
Jan 23 04:00:49 np0005593295 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 23 04:00:49 np0005593295 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 23 04:00:49 np0005593295 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Jan 23 04:00:49 np0005593295 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 23 04:00:49 np0005593295 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 23 04:00:49 np0005593295 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 23 04:00:49 np0005593295 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 23 04:00:49 np0005593295 kernel: Spectre V2 : Mitigation: Retpolines
Jan 23 04:00:49 np0005593295 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 23 04:00:49 np0005593295 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 23 04:00:49 np0005593295 kernel: RETBleed: Mitigation: untrained return thunk
Jan 23 04:00:49 np0005593295 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 23 04:00:49 np0005593295 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 23 04:00:49 np0005593295 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 23 04:00:49 np0005593295 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 23 04:00:49 np0005593295 kernel: x86/bugs: return thunk changed
Jan 23 04:00:49 np0005593295 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 23 04:00:49 np0005593295 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 23 04:00:49 np0005593295 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 23 04:00:49 np0005593295 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 23 04:00:49 np0005593295 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 23 04:00:49 np0005593295 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 23 04:00:49 np0005593295 kernel: Freeing SMP alternatives memory: 40K
Jan 23 04:00:49 np0005593295 kernel: pid_max: default: 32768 minimum: 301
Jan 23 04:00:49 np0005593295 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 23 04:00:49 np0005593295 kernel: landlock: Up and running.
Jan 23 04:00:49 np0005593295 kernel: Yama: becoming mindful.
Jan 23 04:00:49 np0005593295 kernel: SELinux:  Initializing.
Jan 23 04:00:49 np0005593295 kernel: LSM support for eBPF active
Jan 23 04:00:49 np0005593295 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 23 04:00:49 np0005593295 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 23 04:00:49 np0005593295 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 23 04:00:49 np0005593295 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 23 04:00:49 np0005593295 kernel: ... version:                0
Jan 23 04:00:49 np0005593295 kernel: ... bit width:              48
Jan 23 04:00:49 np0005593295 kernel: ... generic registers:      6
Jan 23 04:00:49 np0005593295 kernel: ... value mask:             0000ffffffffffff
Jan 23 04:00:49 np0005593295 kernel: ... max period:             00007fffffffffff
Jan 23 04:00:49 np0005593295 kernel: ... fixed-purpose events:   0
Jan 23 04:00:49 np0005593295 kernel: ... event mask:             000000000000003f
Jan 23 04:00:49 np0005593295 kernel: signal: max sigframe size: 1776
Jan 23 04:00:49 np0005593295 kernel: rcu: Hierarchical SRCU implementation.
Jan 23 04:00:49 np0005593295 kernel: rcu: 	Max phase no-delay instances is 400.
Jan 23 04:00:49 np0005593295 kernel: smp: Bringing up secondary CPUs ...
Jan 23 04:00:49 np0005593295 kernel: smpboot: x86: Booting SMP configuration:
Jan 23 04:00:49 np0005593295 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 23 04:00:49 np0005593295 kernel: smp: Brought up 1 node, 8 CPUs
Jan 23 04:00:49 np0005593295 kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Jan 23 04:00:49 np0005593295 kernel: node 0 deferred pages initialised in 10ms
Jan 23 04:00:49 np0005593295 kernel: Memory: 7763764K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 618356K reserved, 0K cma-reserved)
Jan 23 04:00:49 np0005593295 kernel: devtmpfs: initialized
Jan 23 04:00:49 np0005593295 kernel: x86/mm: Memory block size: 128MB
Jan 23 04:00:49 np0005593295 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 23 04:00:49 np0005593295 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 23 04:00:49 np0005593295 kernel: pinctrl core: initialized pinctrl subsystem
Jan 23 04:00:49 np0005593295 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 23 04:00:49 np0005593295 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 23 04:00:49 np0005593295 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 23 04:00:49 np0005593295 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 23 04:00:49 np0005593295 kernel: audit: initializing netlink subsys (disabled)
Jan 23 04:00:49 np0005593295 kernel: audit: type=2000 audit(1769158847.233:1): state=initialized audit_enabled=0 res=1
Jan 23 04:00:49 np0005593295 kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 23 04:00:49 np0005593295 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 23 04:00:49 np0005593295 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 23 04:00:49 np0005593295 kernel: cpuidle: using governor menu
Jan 23 04:00:49 np0005593295 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 23 04:00:49 np0005593295 kernel: PCI: Using configuration type 1 for base access
Jan 23 04:00:49 np0005593295 kernel: PCI: Using configuration type 1 for extended access
Jan 23 04:00:49 np0005593295 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 23 04:00:49 np0005593295 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 23 04:00:49 np0005593295 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 23 04:00:49 np0005593295 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 23 04:00:49 np0005593295 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 23 04:00:49 np0005593295 kernel: Demotion targets for Node 0: null
Jan 23 04:00:49 np0005593295 kernel: cryptd: max_cpu_qlen set to 1000
Jan 23 04:00:49 np0005593295 kernel: ACPI: Added _OSI(Module Device)
Jan 23 04:00:49 np0005593295 kernel: ACPI: Added _OSI(Processor Device)
Jan 23 04:00:49 np0005593295 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 23 04:00:49 np0005593295 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 23 04:00:49 np0005593295 kernel: ACPI: Interpreter enabled
Jan 23 04:00:49 np0005593295 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 23 04:00:49 np0005593295 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 23 04:00:49 np0005593295 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 23 04:00:49 np0005593295 kernel: PCI: Using E820 reservations for host bridge windows
Jan 23 04:00:49 np0005593295 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 23 04:00:49 np0005593295 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 23 04:00:49 np0005593295 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [3] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [4] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [5] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [6] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [7] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [8] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [9] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [10] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [11] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [12] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [13] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [14] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [15] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [16] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [17] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [18] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [19] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [20] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [21] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [22] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [23] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [24] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [25] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [26] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [27] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [28] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [29] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [30] registered
Jan 23 04:00:49 np0005593295 kernel: acpiphp: Slot [31] registered
Jan 23 04:00:49 np0005593295 kernel: PCI host bridge to bus 0000:00
Jan 23 04:00:49 np0005593295 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 23 04:00:49 np0005593295 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 23 04:00:49 np0005593295 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 23 04:00:49 np0005593295 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 23 04:00:49 np0005593295 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 23 04:00:49 np0005593295 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 23 04:00:49 np0005593295 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 23 04:00:49 np0005593295 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 23 04:00:49 np0005593295 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 23 04:00:49 np0005593295 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 23 04:00:49 np0005593295 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 23 04:00:49 np0005593295 kernel: iommu: Default domain type: Translated
Jan 23 04:00:49 np0005593295 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 23 04:00:49 np0005593295 kernel: SCSI subsystem initialized
Jan 23 04:00:49 np0005593295 kernel: ACPI: bus type USB registered
Jan 23 04:00:49 np0005593295 kernel: usbcore: registered new interface driver usbfs
Jan 23 04:00:49 np0005593295 kernel: usbcore: registered new interface driver hub
Jan 23 04:00:49 np0005593295 kernel: usbcore: registered new device driver usb
Jan 23 04:00:49 np0005593295 kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 23 04:00:49 np0005593295 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 23 04:00:49 np0005593295 kernel: PTP clock support registered
Jan 23 04:00:49 np0005593295 kernel: EDAC MC: Ver: 3.0.0
Jan 23 04:00:49 np0005593295 kernel: NetLabel: Initializing
Jan 23 04:00:49 np0005593295 kernel: NetLabel:  domain hash size = 128
Jan 23 04:00:49 np0005593295 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 23 04:00:49 np0005593295 kernel: NetLabel:  unlabeled traffic allowed by default
Jan 23 04:00:49 np0005593295 kernel: PCI: Using ACPI for IRQ routing
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 23 04:00:49 np0005593295 kernel: vgaarb: loaded
Jan 23 04:00:49 np0005593295 kernel: clocksource: Switched to clocksource kvm-clock
Jan 23 04:00:49 np0005593295 kernel: VFS: Disk quotas dquot_6.6.0
Jan 23 04:00:49 np0005593295 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 23 04:00:49 np0005593295 kernel: pnp: PnP ACPI init
Jan 23 04:00:49 np0005593295 kernel: pnp: PnP ACPI: found 5 devices
Jan 23 04:00:49 np0005593295 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 23 04:00:49 np0005593295 kernel: NET: Registered PF_INET protocol family
Jan 23 04:00:49 np0005593295 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 23 04:00:49 np0005593295 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 23 04:00:49 np0005593295 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 23 04:00:49 np0005593295 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 23 04:00:49 np0005593295 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 23 04:00:49 np0005593295 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 23 04:00:49 np0005593295 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 23 04:00:49 np0005593295 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 23 04:00:49 np0005593295 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 23 04:00:49 np0005593295 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 23 04:00:49 np0005593295 kernel: NET: Registered PF_XDP protocol family
Jan 23 04:00:49 np0005593295 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 23 04:00:49 np0005593295 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 23 04:00:49 np0005593295 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 23 04:00:49 np0005593295 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 23 04:00:49 np0005593295 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 23 04:00:49 np0005593295 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 23 04:00:49 np0005593295 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 87300 usecs
Jan 23 04:00:49 np0005593295 kernel: PCI: CLS 0 bytes, default 64
Jan 23 04:00:49 np0005593295 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 23 04:00:49 np0005593295 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 23 04:00:49 np0005593295 kernel: ACPI: bus type thunderbolt registered
Jan 23 04:00:49 np0005593295 kernel: Trying to unpack rootfs image as initramfs...
Jan 23 04:00:49 np0005593295 kernel: Initialise system trusted keyrings
Jan 23 04:00:49 np0005593295 kernel: Key type blacklist registered
Jan 23 04:00:49 np0005593295 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 23 04:00:49 np0005593295 kernel: zbud: loaded
Jan 23 04:00:49 np0005593295 kernel: integrity: Platform Keyring initialized
Jan 23 04:00:49 np0005593295 kernel: integrity: Machine keyring initialized
Jan 23 04:00:49 np0005593295 kernel: Freeing initrd memory: 87956K
Jan 23 04:00:49 np0005593295 kernel: NET: Registered PF_ALG protocol family
Jan 23 04:00:49 np0005593295 kernel: xor: automatically using best checksumming function   avx       
Jan 23 04:00:49 np0005593295 kernel: Key type asymmetric registered
Jan 23 04:00:49 np0005593295 kernel: Asymmetric key parser 'x509' registered
Jan 23 04:00:49 np0005593295 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 23 04:00:49 np0005593295 kernel: io scheduler mq-deadline registered
Jan 23 04:00:49 np0005593295 kernel: io scheduler kyber registered
Jan 23 04:00:49 np0005593295 kernel: io scheduler bfq registered
Jan 23 04:00:49 np0005593295 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 23 04:00:49 np0005593295 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 23 04:00:49 np0005593295 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 23 04:00:49 np0005593295 kernel: ACPI: button: Power Button [PWRF]
Jan 23 04:00:49 np0005593295 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 23 04:00:49 np0005593295 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 23 04:00:49 np0005593295 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 23 04:00:49 np0005593295 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 23 04:00:49 np0005593295 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 23 04:00:49 np0005593295 kernel: Non-volatile memory driver v1.3
Jan 23 04:00:49 np0005593295 kernel: rdac: device handler registered
Jan 23 04:00:49 np0005593295 kernel: hp_sw: device handler registered
Jan 23 04:00:49 np0005593295 kernel: emc: device handler registered
Jan 23 04:00:49 np0005593295 kernel: alua: device handler registered
Jan 23 04:00:49 np0005593295 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 23 04:00:49 np0005593295 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 23 04:00:49 np0005593295 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 23 04:00:49 np0005593295 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 23 04:00:49 np0005593295 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 23 04:00:49 np0005593295 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 23 04:00:49 np0005593295 kernel: usb usb1: Product: UHCI Host Controller
Jan 23 04:00:49 np0005593295 kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 23 04:00:49 np0005593295 kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 23 04:00:49 np0005593295 kernel: hub 1-0:1.0: USB hub found
Jan 23 04:00:49 np0005593295 kernel: hub 1-0:1.0: 2 ports detected
Jan 23 04:00:49 np0005593295 kernel: usbcore: registered new interface driver usbserial_generic
Jan 23 04:00:49 np0005593295 kernel: usbserial: USB Serial support registered for generic
Jan 23 04:00:49 np0005593295 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 23 04:00:49 np0005593295 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 23 04:00:49 np0005593295 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 23 04:00:49 np0005593295 kernel: mousedev: PS/2 mouse device common for all mice
Jan 23 04:00:49 np0005593295 kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 23 04:00:49 np0005593295 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 23 04:00:49 np0005593295 kernel: rtc_cmos 00:04: registered as rtc0
Jan 23 04:00:49 np0005593295 kernel: rtc_cmos 00:04: setting system clock to 2026-01-23T09:00:48 UTC (1769158848)
Jan 23 04:00:49 np0005593295 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 23 04:00:49 np0005593295 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 23 04:00:49 np0005593295 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 23 04:00:49 np0005593295 kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 23 04:00:49 np0005593295 kernel: usbcore: registered new interface driver usbhid
Jan 23 04:00:49 np0005593295 kernel: usbhid: USB HID core driver
Jan 23 04:00:49 np0005593295 kernel: drop_monitor: Initializing network drop monitor service
Jan 23 04:00:49 np0005593295 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 23 04:00:49 np0005593295 kernel: Initializing XFRM netlink socket
Jan 23 04:00:49 np0005593295 kernel: NET: Registered PF_INET6 protocol family
Jan 23 04:00:49 np0005593295 kernel: Segment Routing with IPv6
Jan 23 04:00:49 np0005593295 kernel: NET: Registered PF_PACKET protocol family
Jan 23 04:00:49 np0005593295 kernel: mpls_gso: MPLS GSO support
Jan 23 04:00:49 np0005593295 kernel: IPI shorthand broadcast: enabled
Jan 23 04:00:49 np0005593295 kernel: AVX2 version of gcm_enc/dec engaged.
Jan 23 04:00:49 np0005593295 kernel: AES CTR mode by8 optimization enabled
Jan 23 04:00:49 np0005593295 kernel: sched_clock: Marking stable (1273006764, 157430312)->(1579641599, -149204523)
Jan 23 04:00:49 np0005593295 kernel: registered taskstats version 1
Jan 23 04:00:49 np0005593295 kernel: Loading compiled-in X.509 certificates
Jan 23 04:00:49 np0005593295 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 23 04:00:49 np0005593295 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 23 04:00:49 np0005593295 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 23 04:00:49 np0005593295 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 23 04:00:49 np0005593295 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 23 04:00:49 np0005593295 kernel: Demotion targets for Node 0: null
Jan 23 04:00:49 np0005593295 kernel: page_owner is disabled
Jan 23 04:00:49 np0005593295 kernel: Key type .fscrypt registered
Jan 23 04:00:49 np0005593295 kernel: Key type fscrypt-provisioning registered
Jan 23 04:00:49 np0005593295 kernel: Key type big_key registered
Jan 23 04:00:49 np0005593295 kernel: Key type encrypted registered
Jan 23 04:00:49 np0005593295 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 23 04:00:49 np0005593295 kernel: Loading compiled-in module X.509 certificates
Jan 23 04:00:49 np0005593295 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 23 04:00:49 np0005593295 kernel: ima: Allocated hash algorithm: sha256
Jan 23 04:00:49 np0005593295 kernel: ima: No architecture policies found
Jan 23 04:00:49 np0005593295 kernel: evm: Initialising EVM extended attributes:
Jan 23 04:00:49 np0005593295 kernel: evm: security.selinux
Jan 23 04:00:49 np0005593295 kernel: evm: security.SMACK64 (disabled)
Jan 23 04:00:49 np0005593295 kernel: evm: security.SMACK64EXEC (disabled)
Jan 23 04:00:49 np0005593295 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 23 04:00:49 np0005593295 kernel: evm: security.SMACK64MMAP (disabled)
Jan 23 04:00:49 np0005593295 kernel: evm: security.apparmor (disabled)
Jan 23 04:00:49 np0005593295 kernel: evm: security.ima
Jan 23 04:00:49 np0005593295 kernel: evm: security.capability
Jan 23 04:00:49 np0005593295 kernel: evm: HMAC attrs: 0x1
Jan 23 04:00:49 np0005593295 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 23 04:00:49 np0005593295 kernel: Running certificate verification RSA selftest
Jan 23 04:00:49 np0005593295 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 23 04:00:49 np0005593295 kernel: Running certificate verification ECDSA selftest
Jan 23 04:00:49 np0005593295 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 23 04:00:49 np0005593295 kernel: clk: Disabling unused clocks
Jan 23 04:00:49 np0005593295 kernel: Freeing unused decrypted memory: 2028K
Jan 23 04:00:49 np0005593295 kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 23 04:00:49 np0005593295 kernel: Write protecting the kernel read-only data: 30720k
Jan 23 04:00:49 np0005593295 kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 23 04:00:49 np0005593295 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 23 04:00:49 np0005593295 kernel: Run /init as init process
Jan 23 04:00:49 np0005593295 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 23 04:00:49 np0005593295 systemd: Detected virtualization kvm.
Jan 23 04:00:49 np0005593295 systemd: Detected architecture x86-64.
Jan 23 04:00:49 np0005593295 systemd: Running in initrd.
Jan 23 04:00:49 np0005593295 systemd: No hostname configured, using default hostname.
Jan 23 04:00:49 np0005593295 systemd: Hostname set to <localhost>.
Jan 23 04:00:49 np0005593295 systemd: Initializing machine ID from VM UUID.
Jan 23 04:00:49 np0005593295 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 23 04:00:49 np0005593295 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 23 04:00:49 np0005593295 kernel: usb 1-1: Product: QEMU USB Tablet
Jan 23 04:00:49 np0005593295 kernel: usb 1-1: Manufacturer: QEMU
Jan 23 04:00:49 np0005593295 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 23 04:00:49 np0005593295 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 23 04:00:49 np0005593295 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 23 04:00:49 np0005593295 systemd: Queued start job for default target Initrd Default Target.
Jan 23 04:00:49 np0005593295 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 23 04:00:49 np0005593295 systemd: Reached target Local Encrypted Volumes.
Jan 23 04:00:49 np0005593295 systemd: Reached target Initrd /usr File System.
Jan 23 04:00:49 np0005593295 systemd: Reached target Local File Systems.
Jan 23 04:00:49 np0005593295 systemd: Reached target Path Units.
Jan 23 04:00:49 np0005593295 systemd: Reached target Slice Units.
Jan 23 04:00:49 np0005593295 systemd: Reached target Swaps.
Jan 23 04:00:49 np0005593295 systemd: Reached target Timer Units.
Jan 23 04:00:49 np0005593295 systemd: Listening on D-Bus System Message Bus Socket.
Jan 23 04:00:49 np0005593295 systemd: Listening on Journal Socket (/dev/log).
Jan 23 04:00:49 np0005593295 systemd: Listening on Journal Socket.
Jan 23 04:00:49 np0005593295 systemd: Listening on udev Control Socket.
Jan 23 04:00:49 np0005593295 systemd: Listening on udev Kernel Socket.
Jan 23 04:00:49 np0005593295 systemd: Reached target Socket Units.
Jan 23 04:00:49 np0005593295 systemd: Starting Create List of Static Device Nodes...
Jan 23 04:00:49 np0005593295 systemd: Starting Journal Service...
Jan 23 04:00:49 np0005593295 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 23 04:00:49 np0005593295 systemd: Starting Apply Kernel Variables...
Jan 23 04:00:49 np0005593295 systemd: Starting Create System Users...
Jan 23 04:00:49 np0005593295 systemd: Starting Setup Virtual Console...
Jan 23 04:00:49 np0005593295 systemd: Finished Create List of Static Device Nodes.
Jan 23 04:00:49 np0005593295 systemd: Finished Apply Kernel Variables.
Jan 23 04:00:49 np0005593295 systemd: Finished Create System Users.
Jan 23 04:00:49 np0005593295 systemd-journald[306]: Journal started
Jan 23 04:00:49 np0005593295 systemd-journald[306]: Runtime Journal (/run/log/journal/84c28ede41124d768f99c7405a7d029c) is 8.0M, max 153.6M, 145.6M free.
Jan 23 04:00:49 np0005593295 systemd-sysusers[311]: Creating group 'users' with GID 100.
Jan 23 04:00:49 np0005593295 systemd-sysusers[311]: Creating group 'dbus' with GID 81.
Jan 23 04:00:49 np0005593295 systemd-sysusers[311]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 23 04:00:49 np0005593295 systemd: Started Journal Service.
Jan 23 04:00:49 np0005593295 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 23 04:00:49 np0005593295 systemd[1]: Starting Create Volatile Files and Directories...
Jan 23 04:00:49 np0005593295 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 23 04:00:49 np0005593295 systemd[1]: Finished Setup Virtual Console.
Jan 23 04:00:49 np0005593295 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 23 04:00:49 np0005593295 systemd[1]: Starting dracut cmdline hook...
Jan 23 04:00:49 np0005593295 dracut-cmdline[324]: dracut-9 dracut-057-102.git20250818.el9
Jan 23 04:00:49 np0005593295 dracut-cmdline[324]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 23 04:00:49 np0005593295 systemd[1]: Finished Create Volatile Files and Directories.
Jan 23 04:00:49 np0005593295 systemd[1]: Finished dracut cmdline hook.
Jan 23 04:00:49 np0005593295 systemd[1]: Starting dracut pre-udev hook...
Jan 23 04:00:49 np0005593295 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 23 04:00:49 np0005593295 kernel: device-mapper: uevent: version 1.0.3
Jan 23 04:00:49 np0005593295 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 23 04:00:49 np0005593295 kernel: RPC: Registered named UNIX socket transport module.
Jan 23 04:00:49 np0005593295 kernel: RPC: Registered udp transport module.
Jan 23 04:00:49 np0005593295 kernel: RPC: Registered tcp transport module.
Jan 23 04:00:49 np0005593295 kernel: RPC: Registered tcp-with-tls transport module.
Jan 23 04:00:49 np0005593295 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 23 04:00:49 np0005593295 rpc.statd[442]: Version 2.5.4 starting
Jan 23 04:00:49 np0005593295 rpc.statd[442]: Initializing NSM state
Jan 23 04:00:49 np0005593295 rpc.idmapd[447]: Setting log level to 0
Jan 23 04:00:49 np0005593295 systemd[1]: Finished dracut pre-udev hook.
Jan 23 04:00:49 np0005593295 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 23 04:00:49 np0005593295 systemd-udevd[460]: Using default interface naming scheme 'rhel-9.0'.
Jan 23 04:00:49 np0005593295 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 23 04:00:49 np0005593295 systemd[1]: Starting dracut pre-trigger hook...
Jan 23 04:00:49 np0005593295 systemd[1]: Finished dracut pre-trigger hook.
Jan 23 04:00:49 np0005593295 systemd[1]: Starting Coldplug All udev Devices...
Jan 23 04:00:49 np0005593295 systemd[1]: Created slice Slice /system/modprobe.
Jan 23 04:00:49 np0005593295 systemd[1]: Starting Load Kernel Module configfs...
Jan 23 04:00:49 np0005593295 systemd[1]: Finished Coldplug All udev Devices.
Jan 23 04:00:49 np0005593295 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 04:00:49 np0005593295 systemd[1]: Finished Load Kernel Module configfs.
Jan 23 04:00:49 np0005593295 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 23 04:00:49 np0005593295 systemd[1]: Reached target Network.
Jan 23 04:00:49 np0005593295 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 23 04:00:49 np0005593295 systemd[1]: Starting dracut initqueue hook...
Jan 23 04:00:49 np0005593295 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 23 04:00:49 np0005593295 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 23 04:00:49 np0005593295 kernel: vda: vda1
Jan 23 04:00:49 np0005593295 systemd-udevd[482]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:00:49 np0005593295 kernel: scsi host0: ata_piix
Jan 23 04:00:49 np0005593295 kernel: scsi host1: ata_piix
Jan 23 04:00:49 np0005593295 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 23 04:00:49 np0005593295 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 23 04:00:50 np0005593295 systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 23 04:00:50 np0005593295 systemd[1]: Reached target Initrd Root Device.
Jan 23 04:00:50 np0005593295 systemd[1]: Mounting Kernel Configuration File System...
Jan 23 04:00:50 np0005593295 systemd[1]: Mounted Kernel Configuration File System.
Jan 23 04:00:50 np0005593295 kernel: ata1: found unknown device (class 0)
Jan 23 04:00:50 np0005593295 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 23 04:00:50 np0005593295 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 23 04:00:50 np0005593295 systemd[1]: Reached target System Initialization.
Jan 23 04:00:50 np0005593295 systemd[1]: Reached target Basic System.
Jan 23 04:00:50 np0005593295 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 23 04:00:50 np0005593295 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 23 04:00:50 np0005593295 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 23 04:00:50 np0005593295 systemd[1]: Finished dracut initqueue hook.
Jan 23 04:00:50 np0005593295 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 23 04:00:50 np0005593295 systemd[1]: Reached target Remote Encrypted Volumes.
Jan 23 04:00:50 np0005593295 systemd[1]: Reached target Remote File Systems.
Jan 23 04:00:50 np0005593295 systemd[1]: Starting dracut pre-mount hook...
Jan 23 04:00:50 np0005593295 systemd[1]: Finished dracut pre-mount hook.
Jan 23 04:00:50 np0005593295 systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 23 04:00:50 np0005593295 systemd-fsck[554]: /usr/sbin/fsck.xfs: XFS file system.
Jan 23 04:00:50 np0005593295 systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 23 04:00:50 np0005593295 systemd[1]: Mounting /sysroot...
Jan 23 04:00:51 np0005593295 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 23 04:00:51 np0005593295 kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 23 04:00:51 np0005593295 kernel: XFS (vda1): Ending clean mount
Jan 23 04:00:51 np0005593295 systemd[1]: Mounted /sysroot.
Jan 23 04:00:51 np0005593295 systemd[1]: Reached target Initrd Root File System.
Jan 23 04:00:51 np0005593295 systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 23 04:00:51 np0005593295 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 23 04:00:51 np0005593295 systemd[1]: Reached target Initrd File Systems.
Jan 23 04:00:51 np0005593295 systemd[1]: Reached target Initrd Default Target.
Jan 23 04:00:51 np0005593295 systemd[1]: Starting dracut mount hook...
Jan 23 04:00:51 np0005593295 systemd[1]: Finished dracut mount hook.
Jan 23 04:00:51 np0005593295 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 23 04:00:51 np0005593295 rpc.idmapd[447]: exiting on signal 15
Jan 23 04:00:51 np0005593295 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 23 04:00:51 np0005593295 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped target Network.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped target Timer Units.
Jan 23 04:00:51 np0005593295 systemd[1]: dbus.socket: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 23 04:00:51 np0005593295 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped target Initrd Default Target.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped target Basic System.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped target Initrd Root Device.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped target Initrd /usr File System.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped target Path Units.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped target Remote File Systems.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped target Slice Units.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped target Socket Units.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped target System Initialization.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped target Local File Systems.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped target Swaps.
Jan 23 04:00:51 np0005593295 systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped dracut mount hook.
Jan 23 04:00:51 np0005593295 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped dracut pre-mount hook.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped target Local Encrypted Volumes.
Jan 23 04:00:51 np0005593295 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 23 04:00:51 np0005593295 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped dracut initqueue hook.
Jan 23 04:00:51 np0005593295 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped Apply Kernel Variables.
Jan 23 04:00:51 np0005593295 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped Create Volatile Files and Directories.
Jan 23 04:00:51 np0005593295 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped Coldplug All udev Devices.
Jan 23 04:00:51 np0005593295 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped dracut pre-trigger hook.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 23 04:00:51 np0005593295 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped Setup Virtual Console.
Jan 23 04:00:51 np0005593295 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 23 04:00:51 np0005593295 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Closed udev Control Socket.
Jan 23 04:00:51 np0005593295 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Closed udev Kernel Socket.
Jan 23 04:00:51 np0005593295 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped dracut pre-udev hook.
Jan 23 04:00:51 np0005593295 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped dracut cmdline hook.
Jan 23 04:00:51 np0005593295 systemd[1]: Starting Cleanup udev Database...
Jan 23 04:00:51 np0005593295 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 23 04:00:51 np0005593295 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped Create List of Static Device Nodes.
Jan 23 04:00:51 np0005593295 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Stopped Create System Users.
Jan 23 04:00:51 np0005593295 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 23 04:00:51 np0005593295 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 23 04:00:51 np0005593295 systemd[1]: Finished Cleanup udev Database.
Jan 23 04:00:51 np0005593295 systemd[1]: Reached target Switch Root.
Jan 23 04:00:51 np0005593295 systemd[1]: Starting Switch Root...
Jan 23 04:00:51 np0005593295 systemd[1]: Switching root.
Jan 23 04:00:51 np0005593295 systemd-journald[306]: Journal stopped
Jan 23 04:00:52 np0005593295 systemd-journald: Received SIGTERM from PID 1 (systemd).
Jan 23 04:00:52 np0005593295 kernel: audit: type=1404 audit(1769158851.471:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 23 04:00:52 np0005593295 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 04:00:52 np0005593295 kernel: SELinux:  policy capability open_perms=1
Jan 23 04:00:52 np0005593295 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 04:00:52 np0005593295 kernel: SELinux:  policy capability always_check_network=0
Jan 23 04:00:52 np0005593295 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 04:00:52 np0005593295 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 04:00:52 np0005593295 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 04:00:52 np0005593295 kernel: audit: type=1403 audit(1769158851.596:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 23 04:00:52 np0005593295 systemd: Successfully loaded SELinux policy in 127.604ms.
Jan 23 04:00:52 np0005593295 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 27.947ms.
Jan 23 04:00:52 np0005593295 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 23 04:00:52 np0005593295 systemd: Detected virtualization kvm.
Jan 23 04:00:52 np0005593295 systemd: Detected architecture x86-64.
Jan 23 04:00:52 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:00:52 np0005593295 systemd: initrd-switch-root.service: Deactivated successfully.
Jan 23 04:00:52 np0005593295 systemd: Stopped Switch Root.
Jan 23 04:00:52 np0005593295 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 23 04:00:52 np0005593295 systemd: Created slice Slice /system/getty.
Jan 23 04:00:52 np0005593295 systemd: Created slice Slice /system/serial-getty.
Jan 23 04:00:52 np0005593295 systemd: Created slice Slice /system/sshd-keygen.
Jan 23 04:00:52 np0005593295 systemd: Created slice User and Session Slice.
Jan 23 04:00:52 np0005593295 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 23 04:00:52 np0005593295 systemd: Started Forward Password Requests to Wall Directory Watch.
Jan 23 04:00:52 np0005593295 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 23 04:00:52 np0005593295 systemd: Reached target Local Encrypted Volumes.
Jan 23 04:00:52 np0005593295 systemd: Stopped target Switch Root.
Jan 23 04:00:52 np0005593295 systemd: Stopped target Initrd File Systems.
Jan 23 04:00:52 np0005593295 systemd: Stopped target Initrd Root File System.
Jan 23 04:00:52 np0005593295 systemd: Reached target Local Integrity Protected Volumes.
Jan 23 04:00:52 np0005593295 systemd: Reached target Path Units.
Jan 23 04:00:52 np0005593295 systemd: Reached target rpc_pipefs.target.
Jan 23 04:00:52 np0005593295 systemd: Reached target Slice Units.
Jan 23 04:00:52 np0005593295 systemd: Reached target Swaps.
Jan 23 04:00:52 np0005593295 systemd: Reached target Local Verity Protected Volumes.
Jan 23 04:00:52 np0005593295 systemd: Listening on RPCbind Server Activation Socket.
Jan 23 04:00:52 np0005593295 systemd: Reached target RPC Port Mapper.
Jan 23 04:00:52 np0005593295 systemd: Listening on Process Core Dump Socket.
Jan 23 04:00:52 np0005593295 systemd: Listening on initctl Compatibility Named Pipe.
Jan 23 04:00:52 np0005593295 systemd: Listening on udev Control Socket.
Jan 23 04:00:52 np0005593295 systemd: Listening on udev Kernel Socket.
Jan 23 04:00:52 np0005593295 systemd: Mounting Huge Pages File System...
Jan 23 04:00:52 np0005593295 systemd: Mounting POSIX Message Queue File System...
Jan 23 04:00:52 np0005593295 systemd: Mounting Kernel Debug File System...
Jan 23 04:00:52 np0005593295 systemd: Mounting Kernel Trace File System...
Jan 23 04:00:52 np0005593295 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 23 04:00:52 np0005593295 systemd: Starting Create List of Static Device Nodes...
Jan 23 04:00:52 np0005593295 systemd: Starting Load Kernel Module configfs...
Jan 23 04:00:52 np0005593295 systemd: Starting Load Kernel Module drm...
Jan 23 04:00:52 np0005593295 systemd: Starting Load Kernel Module efi_pstore...
Jan 23 04:00:52 np0005593295 systemd: Starting Load Kernel Module fuse...
Jan 23 04:00:52 np0005593295 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 23 04:00:52 np0005593295 systemd: systemd-fsck-root.service: Deactivated successfully.
Jan 23 04:00:52 np0005593295 systemd: Stopped File System Check on Root Device.
Jan 23 04:00:52 np0005593295 systemd: Stopped Journal Service.
Jan 23 04:00:52 np0005593295 kernel: fuse: init (API version 7.37)
Jan 23 04:00:52 np0005593295 systemd: Starting Journal Service...
Jan 23 04:00:52 np0005593295 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 23 04:00:52 np0005593295 systemd: Starting Generate network units from Kernel command line...
Jan 23 04:00:52 np0005593295 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 23 04:00:52 np0005593295 systemd: Starting Remount Root and Kernel File Systems...
Jan 23 04:00:52 np0005593295 systemd-journald[679]: Journal started
Jan 23 04:00:52 np0005593295 systemd-journald[679]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 23 04:00:52 np0005593295 systemd[1]: Queued start job for default target Multi-User System.
Jan 23 04:00:52 np0005593295 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 23 04:00:52 np0005593295 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 23 04:00:52 np0005593295 systemd: Starting Apply Kernel Variables...
Jan 23 04:00:52 np0005593295 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 23 04:00:52 np0005593295 systemd: Starting Coldplug All udev Devices...
Jan 23 04:00:52 np0005593295 systemd: Started Journal Service.
Jan 23 04:00:52 np0005593295 systemd[1]: Mounted Huge Pages File System.
Jan 23 04:00:52 np0005593295 systemd[1]: Mounted POSIX Message Queue File System.
Jan 23 04:00:52 np0005593295 systemd[1]: Mounted Kernel Debug File System.
Jan 23 04:00:52 np0005593295 systemd[1]: Mounted Kernel Trace File System.
Jan 23 04:00:52 np0005593295 systemd[1]: Finished Create List of Static Device Nodes.
Jan 23 04:00:52 np0005593295 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 04:00:52 np0005593295 systemd[1]: Finished Load Kernel Module configfs.
Jan 23 04:00:52 np0005593295 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 23 04:00:52 np0005593295 systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 23 04:00:52 np0005593295 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 23 04:00:52 np0005593295 systemd[1]: Finished Load Kernel Module fuse.
Jan 23 04:00:52 np0005593295 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 23 04:00:52 np0005593295 systemd[1]: Finished Generate network units from Kernel command line.
Jan 23 04:00:52 np0005593295 systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 23 04:00:52 np0005593295 systemd[1]: Finished Apply Kernel Variables.
Jan 23 04:00:52 np0005593295 systemd[1]: Mounting FUSE Control File System...
Jan 23 04:00:52 np0005593295 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 23 04:00:52 np0005593295 systemd[1]: Starting Rebuild Hardware Database...
Jan 23 04:00:52 np0005593295 kernel: ACPI: bus type drm_connector registered
Jan 23 04:00:52 np0005593295 systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 23 04:00:52 np0005593295 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 23 04:00:52 np0005593295 systemd[1]: Starting Load/Save OS Random Seed...
Jan 23 04:00:52 np0005593295 systemd[1]: Starting Create System Users...
Jan 23 04:00:52 np0005593295 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 23 04:00:52 np0005593295 systemd[1]: Finished Load Kernel Module drm.
Jan 23 04:00:52 np0005593295 systemd-journald[679]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 23 04:00:52 np0005593295 systemd-journald[679]: Received client request to flush runtime journal.
Jan 23 04:00:52 np0005593295 systemd[1]: Mounted FUSE Control File System.
Jan 23 04:00:52 np0005593295 systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 23 04:00:52 np0005593295 systemd[1]: Finished Load/Save OS Random Seed.
Jan 23 04:00:52 np0005593295 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 23 04:00:52 np0005593295 systemd[1]: Finished Create System Users.
Jan 23 04:00:52 np0005593295 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 23 04:00:52 np0005593295 systemd[1]: Finished Coldplug All udev Devices.
Jan 23 04:00:52 np0005593295 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 23 04:00:52 np0005593295 systemd[1]: Reached target Preparation for Local File Systems.
Jan 23 04:00:52 np0005593295 systemd[1]: Reached target Local File Systems.
Jan 23 04:00:52 np0005593295 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 23 04:00:52 np0005593295 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 23 04:00:52 np0005593295 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 23 04:00:52 np0005593295 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 23 04:00:52 np0005593295 systemd[1]: Starting Automatic Boot Loader Update...
Jan 23 04:00:52 np0005593295 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 23 04:00:52 np0005593295 systemd[1]: Starting Create Volatile Files and Directories...
Jan 23 04:00:52 np0005593295 bootctl[697]: Couldn't find EFI system partition, skipping.
Jan 23 04:00:52 np0005593295 systemd[1]: Finished Automatic Boot Loader Update.
Jan 23 04:00:52 np0005593295 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 23 04:00:52 np0005593295 systemd[1]: Finished Create Volatile Files and Directories.
Jan 23 04:00:52 np0005593295 systemd[1]: Starting Security Auditing Service...
Jan 23 04:00:52 np0005593295 systemd[1]: Starting RPC Bind...
Jan 23 04:00:52 np0005593295 systemd[1]: Starting Rebuild Journal Catalog...
Jan 23 04:00:52 np0005593295 auditd[703]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 23 04:00:52 np0005593295 auditd[703]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 23 04:00:52 np0005593295 systemd[1]: Started RPC Bind.
Jan 23 04:00:52 np0005593295 systemd[1]: Finished Rebuild Journal Catalog.
Jan 23 04:00:52 np0005593295 augenrules[708]: /sbin/augenrules: No change
Jan 23 04:00:52 np0005593295 augenrules[723]: No rules
Jan 23 04:00:52 np0005593295 augenrules[723]: enabled 1
Jan 23 04:00:52 np0005593295 augenrules[723]: failure 1
Jan 23 04:00:52 np0005593295 augenrules[723]: pid 703
Jan 23 04:00:52 np0005593295 augenrules[723]: rate_limit 0
Jan 23 04:00:52 np0005593295 augenrules[723]: backlog_limit 8192
Jan 23 04:00:52 np0005593295 augenrules[723]: lost 0
Jan 23 04:00:52 np0005593295 augenrules[723]: backlog 2
Jan 23 04:00:52 np0005593295 augenrules[723]: backlog_wait_time 60000
Jan 23 04:00:52 np0005593295 augenrules[723]: backlog_wait_time_actual 0
Jan 23 04:00:52 np0005593295 augenrules[723]: enabled 1
Jan 23 04:00:52 np0005593295 augenrules[723]: failure 1
Jan 23 04:00:52 np0005593295 augenrules[723]: pid 703
Jan 23 04:00:52 np0005593295 augenrules[723]: rate_limit 0
Jan 23 04:00:52 np0005593295 augenrules[723]: backlog_limit 8192
Jan 23 04:00:52 np0005593295 augenrules[723]: lost 0
Jan 23 04:00:52 np0005593295 augenrules[723]: backlog 2
Jan 23 04:00:52 np0005593295 augenrules[723]: backlog_wait_time 60000
Jan 23 04:00:52 np0005593295 augenrules[723]: backlog_wait_time_actual 0
Jan 23 04:00:52 np0005593295 augenrules[723]: enabled 1
Jan 23 04:00:52 np0005593295 augenrules[723]: failure 1
Jan 23 04:00:52 np0005593295 augenrules[723]: pid 703
Jan 23 04:00:52 np0005593295 augenrules[723]: rate_limit 0
Jan 23 04:00:52 np0005593295 augenrules[723]: backlog_limit 8192
Jan 23 04:00:52 np0005593295 augenrules[723]: lost 0
Jan 23 04:00:52 np0005593295 augenrules[723]: backlog 1
Jan 23 04:00:52 np0005593295 augenrules[723]: backlog_wait_time 60000
Jan 23 04:00:52 np0005593295 augenrules[723]: backlog_wait_time_actual 0
Jan 23 04:00:52 np0005593295 systemd[1]: Started Security Auditing Service.
Jan 23 04:00:52 np0005593295 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 23 04:00:52 np0005593295 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 23 04:00:52 np0005593295 systemd[1]: Finished Rebuild Hardware Database.
Jan 23 04:00:52 np0005593295 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 23 04:00:52 np0005593295 systemd[1]: Starting Update is Completed...
Jan 23 04:00:52 np0005593295 systemd[1]: Finished Update is Completed.
Jan 23 04:00:53 np0005593295 systemd-udevd[731]: Using default interface naming scheme 'rhel-9.0'.
Jan 23 04:00:53 np0005593295 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 23 04:00:53 np0005593295 systemd[1]: Reached target System Initialization.
Jan 23 04:00:53 np0005593295 systemd[1]: Started dnf makecache --timer.
Jan 23 04:00:53 np0005593295 systemd[1]: Started Daily rotation of log files.
Jan 23 04:00:53 np0005593295 systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 23 04:00:53 np0005593295 systemd[1]: Reached target Timer Units.
Jan 23 04:00:53 np0005593295 systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 23 04:00:53 np0005593295 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 23 04:00:53 np0005593295 systemd[1]: Reached target Socket Units.
Jan 23 04:00:53 np0005593295 systemd[1]: Starting D-Bus System Message Bus...
Jan 23 04:00:53 np0005593295 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 23 04:00:53 np0005593295 systemd[1]: Starting Load Kernel Module configfs...
Jan 23 04:00:53 np0005593295 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 23 04:00:53 np0005593295 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 04:00:53 np0005593295 systemd[1]: Finished Load Kernel Module configfs.
Jan 23 04:00:53 np0005593295 systemd-udevd[734]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:00:53 np0005593295 systemd[1]: Started D-Bus System Message Bus.
Jan 23 04:00:53 np0005593295 systemd[1]: Reached target Basic System.
Jan 23 04:00:53 np0005593295 dbus-broker-lau[744]: Ready
Jan 23 04:00:53 np0005593295 systemd[1]: Starting NTP client/server...
Jan 23 04:00:53 np0005593295 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 23 04:00:53 np0005593295 systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 23 04:00:53 np0005593295 systemd[1]: Starting IPv4 firewall with iptables...
Jan 23 04:00:53 np0005593295 systemd[1]: Started irqbalance daemon.
Jan 23 04:00:53 np0005593295 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 23 04:00:53 np0005593295 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 04:00:53 np0005593295 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 04:00:53 np0005593295 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 04:00:53 np0005593295 systemd[1]: Reached target sshd-keygen.target.
Jan 23 04:00:53 np0005593295 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 23 04:00:53 np0005593295 systemd[1]: Reached target User and Group Name Lookups.
Jan 23 04:00:53 np0005593295 systemd[1]: Starting User Login Management...
Jan 23 04:00:53 np0005593295 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 23 04:00:53 np0005593295 systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 23 04:00:53 np0005593295 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 23 04:00:53 np0005593295 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 23 04:00:53 np0005593295 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 23 04:00:53 np0005593295 chronyd[794]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 23 04:00:53 np0005593295 chronyd[794]: Loaded 0 symmetric keys
Jan 23 04:00:53 np0005593295 chronyd[794]: Using right/UTC timezone to obtain leap second data
Jan 23 04:00:53 np0005593295 chronyd[794]: Loaded seccomp filter (level 2)
Jan 23 04:00:53 np0005593295 systemd[1]: Started NTP client/server.
Jan 23 04:00:53 np0005593295 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 23 04:00:53 np0005593295 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 23 04:00:53 np0005593295 systemd-logind[786]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 23 04:00:53 np0005593295 systemd-logind[786]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 23 04:00:53 np0005593295 systemd-logind[786]: New seat seat0.
Jan 23 04:00:53 np0005593295 systemd[1]: Started User Login Management.
Jan 23 04:00:53 np0005593295 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 23 04:00:53 np0005593295 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 23 04:00:53 np0005593295 kernel: kvm_amd: TSC scaling supported
Jan 23 04:00:53 np0005593295 kernel: kvm_amd: Nested Virtualization enabled
Jan 23 04:00:53 np0005593295 kernel: kvm_amd: Nested Paging enabled
Jan 23 04:00:53 np0005593295 kernel: kvm_amd: LBR virtualization supported
Jan 23 04:00:53 np0005593295 kernel: Console: switching to colour dummy device 80x25
Jan 23 04:00:53 np0005593295 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 23 04:00:53 np0005593295 kernel: [drm] features: -context_init
Jan 23 04:00:53 np0005593295 kernel: [drm] number of scanouts: 1
Jan 23 04:00:53 np0005593295 kernel: [drm] number of cap sets: 0
Jan 23 04:00:53 np0005593295 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 23 04:00:53 np0005593295 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 23 04:00:53 np0005593295 kernel: Console: switching to colour frame buffer device 128x48
Jan 23 04:00:53 np0005593295 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 23 04:00:53 np0005593295 iptables.init[779]: iptables: Applying firewall rules: [  OK  ]
Jan 23 04:00:53 np0005593295 systemd[1]: Finished IPv4 firewall with iptables.
Jan 23 04:00:53 np0005593295 cloud-init[840]: Cloud-init v. 24.4-8.el9 running 'init-local' at Fri, 23 Jan 2026 09:00:53 +0000. Up 6.31 seconds.
Jan 23 04:00:53 np0005593295 systemd[1]: run-cloud\x2dinit-tmp-tmpgb1h963_.mount: Deactivated successfully.
Jan 23 04:00:53 np0005593295 systemd[1]: Starting Hostname Service...
Jan 23 04:00:53 np0005593295 systemd[1]: Started Hostname Service.
Jan 23 04:00:53 np0005593295 systemd-hostnamed[854]: Hostname set to <np0005593295.novalocal> (static)
Jan 23 04:00:54 np0005593295 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 23 04:00:54 np0005593295 systemd[1]: Reached target Preparation for Network.
Jan 23 04:00:54 np0005593295 systemd[1]: Starting Network Manager...
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.0905] NetworkManager (version 1.54.3-2.el9) is starting... (boot:20df1b08-a5ba-4a35-8d47-00aa8e9b2616)
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.0911] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.0982] manager[0x5566bace9000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1015] hostname: hostname: using hostnamed
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1016] hostname: static hostname changed from (none) to "np0005593295.novalocal"
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1021] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1115] manager[0x5566bace9000]: rfkill: Wi-Fi hardware radio set enabled
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1116] manager[0x5566bace9000]: rfkill: WWAN hardware radio set enabled
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1156] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1157] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1157] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1158] manager: Networking is enabled by state file
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1160] settings: Loaded settings plugin: keyfile (internal)
Jan 23 04:00:54 np0005593295 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1187] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1205] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1215] dhcp: init: Using DHCP client 'internal'
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1218] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1232] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1239] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1246] device (lo): Activation: starting connection 'lo' (a94dd518-f501-4cf9-bb13-731d2edd38ea)
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1254] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1256] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1282] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1288] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1290] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1292] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1294] device (eth0): carrier: link connected
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1297] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1302] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1306] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1310] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1311] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1312] manager: NetworkManager state is now CONNECTING
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1313] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1319] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1321] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 04:00:54 np0005593295 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 04:00:54 np0005593295 systemd[1]: Started Network Manager.
Jan 23 04:00:54 np0005593295 systemd[1]: Reached target Network.
Jan 23 04:00:54 np0005593295 systemd[1]: Starting Network Manager Wait Online...
Jan 23 04:00:54 np0005593295 systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 23 04:00:54 np0005593295 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1502] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1506] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.1511] device (lo): Activation: successful, device activated.
Jan 23 04:00:54 np0005593295 systemd[1]: Started GSSAPI Proxy Daemon.
Jan 23 04:00:54 np0005593295 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 23 04:00:54 np0005593295 systemd[1]: Reached target NFS client services.
Jan 23 04:00:54 np0005593295 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 23 04:00:54 np0005593295 systemd[1]: Reached target Remote File Systems.
Jan 23 04:00:54 np0005593295 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.3981] dhcp4 (eth0): state changed new lease, address=38.129.56.185
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.3992] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.4008] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.4052] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.4054] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.4056] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.4060] device (eth0): Activation: successful, device activated.
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.4064] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 23 04:00:54 np0005593295 NetworkManager[858]: <info>  [1769158854.4067] manager: startup complete
Jan 23 04:00:54 np0005593295 systemd[1]: Finished Network Manager Wait Online.
Jan 23 04:00:54 np0005593295 systemd[1]: Starting Cloud-init: Network Stage...
Jan 23 04:00:54 np0005593295 cloud-init[921]: Cloud-init v. 24.4-8.el9 running 'init' at Fri, 23 Jan 2026 09:00:54 +0000. Up 7.42 seconds.
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: |  eth0  | True |        38.129.56.185         | 255.255.255.0 | global | fa:16:3e:ac:8a:5e |
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: |  eth0  | True | fe80::f816:3eff:feac:8a5e/64 |       .       |  link  | fa:16:3e:ac:8a:5e |
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: ++++++++++++++++++++++++++++++++Route IPv4 info++++++++++++++++++++++++++++++++
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: | Route |   Destination   |   Gateway   |     Genmask     | Interface | Flags |
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: |   0   |     0.0.0.0     | 38.129.56.1 |     0.0.0.0     |    eth0   |   UG  |
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: |   1   |   38.129.56.0   |   0.0.0.0   |  255.255.255.0  |    eth0   |   U   |
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: |   2   | 169.254.169.254 | 38.129.56.5 | 255.255.255.255 |    eth0   |  UGH  |
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Jan 23 04:00:54 np0005593295 cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 23 04:00:55 np0005593295 cloud-init[921]: Generating public/private rsa key pair.
Jan 23 04:00:55 np0005593295 cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 23 04:00:55 np0005593295 cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 23 04:00:55 np0005593295 cloud-init[921]: The key fingerprint is:
Jan 23 04:00:55 np0005593295 cloud-init[921]: SHA256:m8AhXO3GeQnorWC7FsKZGOrxihbMoSkDu/Ok29CyBEg root@np0005593295.novalocal
Jan 23 04:00:55 np0005593295 cloud-init[921]: The key's randomart image is:
Jan 23 04:00:55 np0005593295 cloud-init[921]: +---[RSA 3072]----+
Jan 23 04:00:55 np0005593295 cloud-init[921]: |      .o         |
Jan 23 04:00:55 np0005593295 cloud-init[921]: |   . .. o        |
Jan 23 04:00:55 np0005593295 cloud-init[921]: | E  o..+ o .     |
Jan 23 04:00:55 np0005593295 cloud-init[921]: |=.  oo..* o      |
Jan 23 04:00:55 np0005593295 cloud-init[921]: |O*.+ oooS.       |
Jan 23 04:00:55 np0005593295 cloud-init[921]: |X*= o .. o       |
Jan 23 04:00:55 np0005593295 cloud-init[921]: |=+*. o  o        |
Jan 23 04:00:55 np0005593295 cloud-init[921]: |=X .o            |
Jan 23 04:00:55 np0005593295 cloud-init[921]: |*=+.             |
Jan 23 04:00:55 np0005593295 cloud-init[921]: +----[SHA256]-----+
Jan 23 04:00:55 np0005593295 cloud-init[921]: Generating public/private ecdsa key pair.
Jan 23 04:00:55 np0005593295 cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 23 04:00:55 np0005593295 cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 23 04:00:55 np0005593295 cloud-init[921]: The key fingerprint is:
Jan 23 04:00:55 np0005593295 cloud-init[921]: SHA256:qp30oz4J6SGx4H6Mw94XvmoahRSp8LSPmsJ+PzjTuFo root@np0005593295.novalocal
Jan 23 04:00:55 np0005593295 cloud-init[921]: The key's randomart image is:
Jan 23 04:00:55 np0005593295 cloud-init[921]: +---[ECDSA 256]---+
Jan 23 04:00:55 np0005593295 cloud-init[921]: | ..              |
Jan 23 04:00:55 np0005593295 cloud-init[921]: |..o              |
Jan 23 04:00:55 np0005593295 cloud-init[921]: |o+ .             |
Jan 23 04:00:55 np0005593295 cloud-init[921]: |+ =              |
Jan 23 04:00:55 np0005593295 cloud-init[921]: |.o * .  S        |
Jan 23 04:00:55 np0005593295 cloud-init[921]: | .= =. .         |
Jan 23 04:00:55 np0005593295 cloud-init[921]: |++oE=o+.         |
Jan 23 04:00:55 np0005593295 cloud-init[921]: |+*+X.Ooo.        |
Jan 23 04:00:55 np0005593295 cloud-init[921]: |+=B+O+Bo..       |
Jan 23 04:00:55 np0005593295 cloud-init[921]: +----[SHA256]-----+
Jan 23 04:00:55 np0005593295 cloud-init[921]: Generating public/private ed25519 key pair.
Jan 23 04:00:55 np0005593295 cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 23 04:00:55 np0005593295 cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 23 04:00:55 np0005593295 cloud-init[921]: The key fingerprint is:
Jan 23 04:00:55 np0005593295 cloud-init[921]: SHA256:AWwv715Qz77GtzINhRcbXo5PsXIeyKYGXwjonJlEnTQ root@np0005593295.novalocal
Jan 23 04:00:55 np0005593295 cloud-init[921]: The key's randomart image is:
Jan 23 04:00:55 np0005593295 cloud-init[921]: +--[ED25519 256]--+
Jan 23 04:00:55 np0005593295 cloud-init[921]: |     .o.+E.      |
Jan 23 04:00:55 np0005593295 cloud-init[921]: |      o+ +.   o..|
Jan 23 04:00:55 np0005593295 cloud-init[921]: |     .+.= o oo.Bo|
Jan 23 04:00:55 np0005593295 cloud-init[921]: |      .*.+ +.**+o|
Jan 23 04:00:55 np0005593295 cloud-init[921]: |       oS o *o+o.|
Jan 23 04:00:55 np0005593295 cloud-init[921]: |        .. =.  ..|
Jan 23 04:00:55 np0005593295 cloud-init[921]: |       .  o..o   |
Jan 23 04:00:55 np0005593295 cloud-init[921]: |        ..  =.o  |
Jan 23 04:00:55 np0005593295 cloud-init[921]: |       ..  ..+.. |
Jan 23 04:00:55 np0005593295 cloud-init[921]: +----[SHA256]-----+
Jan 23 04:00:56 np0005593295 sm-notify[1003]: Version 2.5.4 starting
Jan 23 04:00:56 np0005593295 systemd[1]: Finished Cloud-init: Network Stage.
Jan 23 04:00:56 np0005593295 systemd[1]: Reached target Cloud-config availability.
Jan 23 04:00:56 np0005593295 systemd[1]: Reached target Network is Online.
Jan 23 04:00:56 np0005593295 systemd[1]: Starting Cloud-init: Config Stage...
Jan 23 04:00:56 np0005593295 systemd[1]: Starting Crash recovery kernel arming...
Jan 23 04:00:56 np0005593295 systemd[1]: Starting Notify NFS peers of a restart...
Jan 23 04:00:56 np0005593295 systemd[1]: Starting System Logging Service...
Jan 23 04:00:56 np0005593295 systemd[1]: Starting OpenSSH server daemon...
Jan 23 04:00:56 np0005593295 systemd[1]: Starting Permit User Sessions...
Jan 23 04:00:56 np0005593295 systemd[1]: Started Notify NFS peers of a restart.
Jan 23 04:00:56 np0005593295 systemd[1]: Finished Permit User Sessions.
Jan 23 04:00:56 np0005593295 systemd[1]: Started OpenSSH server daemon.
Jan 23 04:00:56 np0005593295 systemd[1]: Started Command Scheduler.
Jan 23 04:00:56 np0005593295 systemd[1]: Started Getty on tty1.
Jan 23 04:00:56 np0005593295 systemd[1]: Started Serial Getty on ttyS0.
Jan 23 04:00:56 np0005593295 systemd[1]: Reached target Login Prompts.
Jan 23 04:00:56 np0005593295 rsyslogd[1004]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1004" x-info="https://www.rsyslog.com"] start
Jan 23 04:00:56 np0005593295 rsyslogd[1004]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 23 04:00:56 np0005593295 systemd[1]: Started System Logging Service.
Jan 23 04:00:56 np0005593295 systemd[1]: Reached target Multi-User System.
Jan 23 04:00:56 np0005593295 systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 23 04:00:56 np0005593295 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 23 04:00:56 np0005593295 systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 23 04:00:56 np0005593295 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:00:56 np0005593295 kdumpctl[1019]: kdump: No kdump initial ramdisk found.
Jan 23 04:00:56 np0005593295 kdumpctl[1019]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 23 04:00:56 np0005593295 cloud-init[1188]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Fri, 23 Jan 2026 09:00:56 +0000. Up 9.08 seconds.
Jan 23 04:00:56 np0005593295 systemd[1]: Finished Cloud-init: Config Stage.
Jan 23 04:00:56 np0005593295 systemd[1]: Starting Cloud-init: Final Stage...
Jan 23 04:00:56 np0005593295 dracut[1284]: dracut-057-102.git20250818.el9
Jan 23 04:00:56 np0005593295 dracut[1286]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 23 04:00:56 np0005593295 cloud-init[1349]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Fri, 23 Jan 2026 09:00:56 +0000. Up 9.54 seconds.
Jan 23 04:00:56 np0005593295 cloud-init[1362]: #############################################################
Jan 23 04:00:56 np0005593295 cloud-init[1366]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 23 04:00:56 np0005593295 cloud-init[1368]: 256 SHA256:qp30oz4J6SGx4H6Mw94XvmoahRSp8LSPmsJ+PzjTuFo root@np0005593295.novalocal (ECDSA)
Jan 23 04:00:56 np0005593295 cloud-init[1373]: 256 SHA256:AWwv715Qz77GtzINhRcbXo5PsXIeyKYGXwjonJlEnTQ root@np0005593295.novalocal (ED25519)
Jan 23 04:00:56 np0005593295 cloud-init[1375]: 3072 SHA256:m8AhXO3GeQnorWC7FsKZGOrxihbMoSkDu/Ok29CyBEg root@np0005593295.novalocal (RSA)
Jan 23 04:00:56 np0005593295 cloud-init[1377]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 23 04:00:56 np0005593295 cloud-init[1380]: #############################################################
Jan 23 04:00:57 np0005593295 cloud-init[1349]: Cloud-init v. 24.4-8.el9 finished at Fri, 23 Jan 2026 09:00:57 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 9.74 seconds
Jan 23 04:00:57 np0005593295 systemd[1]: Finished Cloud-init: Final Stage.
Jan 23 04:00:57 np0005593295 systemd[1]: Reached target Cloud-init target.
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: memstrack is not available
Jan 23 04:00:57 np0005593295 dracut[1286]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 23 04:00:57 np0005593295 dracut[1286]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 23 04:00:58 np0005593295 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 23 04:00:58 np0005593295 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 23 04:00:58 np0005593295 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 23 04:00:58 np0005593295 dracut[1286]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 23 04:00:58 np0005593295 dracut[1286]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 23 04:00:58 np0005593295 dracut[1286]: memstrack is not available
Jan 23 04:00:58 np0005593295 dracut[1286]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 23 04:00:58 np0005593295 dracut[1286]: *** Including module: systemd ***
Jan 23 04:00:58 np0005593295 dracut[1286]: *** Including module: fips ***
Jan 23 04:00:58 np0005593295 dracut[1286]: *** Including module: systemd-initrd ***
Jan 23 04:00:58 np0005593295 dracut[1286]: *** Including module: i18n ***
Jan 23 04:00:58 np0005593295 dracut[1286]: *** Including module: drm ***
Jan 23 04:00:59 np0005593295 chronyd[794]: Selected source 162.159.200.123 (2.centos.pool.ntp.org)
Jan 23 04:00:59 np0005593295 chronyd[794]: System clock TAI offset set to 37 seconds
Jan 23 04:00:59 np0005593295 dracut[1286]: *** Including module: prefixdevname ***
Jan 23 04:00:59 np0005593295 dracut[1286]: *** Including module: kernel-modules ***
Jan 23 04:00:59 np0005593295 kernel: block vda: the capability attribute has been deprecated.
Jan 23 04:00:59 np0005593295 dracut[1286]: *** Including module: kernel-modules-extra ***
Jan 23 04:00:59 np0005593295 dracut[1286]: *** Including module: qemu ***
Jan 23 04:00:59 np0005593295 dracut[1286]: *** Including module: fstab-sys ***
Jan 23 04:00:59 np0005593295 dracut[1286]: *** Including module: rootfs-block ***
Jan 23 04:00:59 np0005593295 dracut[1286]: *** Including module: terminfo ***
Jan 23 04:01:00 np0005593295 dracut[1286]: *** Including module: udev-rules ***
Jan 23 04:01:00 np0005593295 dracut[1286]: Skipping udev rule: 91-permissions.rules
Jan 23 04:01:00 np0005593295 dracut[1286]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 23 04:01:00 np0005593295 dracut[1286]: *** Including module: virtiofs ***
Jan 23 04:01:00 np0005593295 dracut[1286]: *** Including module: dracut-systemd ***
Jan 23 04:01:00 np0005593295 dracut[1286]: *** Including module: usrmount ***
Jan 23 04:01:00 np0005593295 dracut[1286]: *** Including module: base ***
Jan 23 04:01:00 np0005593295 dracut[1286]: *** Including module: fs-lib ***
Jan 23 04:01:00 np0005593295 dracut[1286]: *** Including module: kdumpbase ***
Jan 23 04:01:01 np0005593295 dracut[1286]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 23 04:01:01 np0005593295 dracut[1286]:  microcode_ctl module: mangling fw_dir
Jan 23 04:01:01 np0005593295 dracut[1286]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 23 04:01:01 np0005593295 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 23 04:01:01 np0005593295 dracut[1286]:    microcode_ctl: configuration "intel" is ignored
Jan 23 04:01:01 np0005593295 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 23 04:01:01 np0005593295 dracut[1286]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 23 04:01:01 np0005593295 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 23 04:01:01 np0005593295 dracut[1286]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 23 04:01:01 np0005593295 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 23 04:01:01 np0005593295 dracut[1286]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 23 04:01:01 np0005593295 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 23 04:01:01 np0005593295 dracut[1286]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 23 04:01:01 np0005593295 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 23 04:01:01 np0005593295 dracut[1286]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 23 04:01:01 np0005593295 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 23 04:01:01 np0005593295 dracut[1286]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 23 04:01:01 np0005593295 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 23 04:01:01 np0005593295 dracut[1286]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 23 04:01:01 np0005593295 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 23 04:01:01 np0005593295 dracut[1286]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 23 04:01:01 np0005593295 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 23 04:01:01 np0005593295 dracut[1286]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 23 04:01:01 np0005593295 dracut[1286]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 23 04:01:01 np0005593295 dracut[1286]: *** Including module: openssl ***
Jan 23 04:01:01 np0005593295 dracut[1286]: *** Including module: shutdown ***
Jan 23 04:01:01 np0005593295 dracut[1286]: *** Including module: squash ***
Jan 23 04:01:01 np0005593295 dracut[1286]: *** Including modules done ***
Jan 23 04:01:01 np0005593295 dracut[1286]: *** Installing kernel module dependencies ***
Jan 23 04:01:02 np0005593295 dracut[1286]: *** Installing kernel module dependencies done ***
Jan 23 04:01:02 np0005593295 dracut[1286]: *** Resolving executable dependencies ***
Jan 23 04:01:03 np0005593295 irqbalance[780]: Cannot change IRQ 35 affinity: Operation not permitted
Jan 23 04:01:03 np0005593295 irqbalance[780]: IRQ 35 affinity is now unmanaged
Jan 23 04:01:03 np0005593295 irqbalance[780]: Cannot change IRQ 33 affinity: Operation not permitted
Jan 23 04:01:03 np0005593295 irqbalance[780]: IRQ 33 affinity is now unmanaged
Jan 23 04:01:03 np0005593295 irqbalance[780]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 23 04:01:03 np0005593295 irqbalance[780]: IRQ 31 affinity is now unmanaged
Jan 23 04:01:03 np0005593295 irqbalance[780]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 23 04:01:03 np0005593295 irqbalance[780]: IRQ 28 affinity is now unmanaged
Jan 23 04:01:03 np0005593295 irqbalance[780]: Cannot change IRQ 34 affinity: Operation not permitted
Jan 23 04:01:03 np0005593295 irqbalance[780]: IRQ 34 affinity is now unmanaged
Jan 23 04:01:03 np0005593295 irqbalance[780]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 23 04:01:03 np0005593295 irqbalance[780]: IRQ 32 affinity is now unmanaged
Jan 23 04:01:03 np0005593295 irqbalance[780]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 23 04:01:03 np0005593295 irqbalance[780]: IRQ 30 affinity is now unmanaged
Jan 23 04:01:03 np0005593295 irqbalance[780]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 23 04:01:03 np0005593295 irqbalance[780]: IRQ 29 affinity is now unmanaged
Jan 23 04:01:03 np0005593295 dracut[1286]: *** Resolving executable dependencies done ***
Jan 23 04:01:03 np0005593295 dracut[1286]: *** Generating early-microcode cpio image ***
Jan 23 04:01:03 np0005593295 dracut[1286]: *** Store current command line parameters ***
Jan 23 04:01:03 np0005593295 dracut[1286]: Stored kernel commandline:
Jan 23 04:01:03 np0005593295 dracut[1286]: No dracut internal kernel commandline stored in the initramfs
Jan 23 04:01:04 np0005593295 dracut[1286]: *** Install squash loader ***
Jan 23 04:01:04 np0005593295 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 04:01:05 np0005593295 dracut[1286]: *** Squashing the files inside the initramfs ***
Jan 23 04:01:06 np0005593295 dracut[1286]: *** Squashing the files inside the initramfs done ***
Jan 23 04:01:06 np0005593295 dracut[1286]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 23 04:01:06 np0005593295 dracut[1286]: *** Hardlinking files ***
Jan 23 04:01:06 np0005593295 dracut[1286]: *** Hardlinking files done ***
Jan 23 04:01:06 np0005593295 dracut[1286]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 23 04:01:07 np0005593295 kdumpctl[1019]: kdump: kexec: loaded kdump kernel
Jan 23 04:01:07 np0005593295 kdumpctl[1019]: kdump: Starting kdump: [OK]
Jan 23 04:01:07 np0005593295 systemd[1]: Finished Crash recovery kernel arming.
Jan 23 04:01:07 np0005593295 systemd[1]: Startup finished in 1.630s (kernel) + 2.548s (initrd) + 15.629s (userspace) = 19.809s.
Jan 23 04:01:24 np0005593295 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 04:01:44 np0005593295 systemd[1]: Created slice User Slice of UID 1000.
Jan 23 04:01:44 np0005593295 systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 23 04:01:44 np0005593295 systemd-logind[786]: New session 1 of user zuul.
Jan 23 04:01:44 np0005593295 systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 23 04:01:44 np0005593295 systemd[1]: Starting User Manager for UID 1000...
Jan 23 04:01:44 np0005593295 systemd[4325]: Queued start job for default target Main User Target.
Jan 23 04:01:44 np0005593295 systemd[4325]: Created slice User Application Slice.
Jan 23 04:01:44 np0005593295 systemd[4325]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 04:01:44 np0005593295 systemd[4325]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 04:01:44 np0005593295 systemd[4325]: Reached target Paths.
Jan 23 04:01:44 np0005593295 systemd[4325]: Reached target Timers.
Jan 23 04:01:44 np0005593295 systemd[4325]: Starting D-Bus User Message Bus Socket...
Jan 23 04:01:44 np0005593295 systemd[4325]: Starting Create User's Volatile Files and Directories...
Jan 23 04:01:44 np0005593295 systemd[4325]: Finished Create User's Volatile Files and Directories.
Jan 23 04:01:44 np0005593295 systemd[4325]: Listening on D-Bus User Message Bus Socket.
Jan 23 04:01:44 np0005593295 systemd[4325]: Reached target Sockets.
Jan 23 04:01:44 np0005593295 systemd[4325]: Reached target Basic System.
Jan 23 04:01:44 np0005593295 systemd[4325]: Reached target Main User Target.
Jan 23 04:01:44 np0005593295 systemd[4325]: Startup finished in 125ms.
Jan 23 04:01:44 np0005593295 systemd[1]: Started User Manager for UID 1000.
Jan 23 04:01:44 np0005593295 systemd[1]: Started Session 1 of User zuul.
Jan 23 04:01:45 np0005593295 python3[4407]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:01:49 np0005593295 python3[4435]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:01:55 np0005593295 python3[4493]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:01:56 np0005593295 python3[4533]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 23 04:01:58 np0005593295 python3[4559]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQChWBsfs5FtlYIS47KhLNXtsYVhP6UT/w4WYq1l1d/b7+cXPAwAb4Qt1cc/BmNcKM419a6D+CvPejxC67s0h4ksuceBjB/s6b88/zjf8Lio8Dd87f6J+f6IY8ByYIQ8s3Hvn6z0K7HSyEMuQ0B/CLxeBW4MJFqcoLK2v7Y8SNPGLr8w/8y79OWnJJPKmfM4ACTo2JwqmPGI/4+LQsCZS/p/yKDTO5AYxsIUwWw/IX3Jxs67UOBqa40onmgM/VRkfGY512fziVUNkmFHG2Aqgosbpbz/XysrVTpvLRA/H2zpGbbTbuEg6xp8vHQO5V0csAd6p3cdOixjdaPmf9oy3+yXuIeWwnnxPHqvVDY6N9aaIX4vuajxOoMUFiQ2YtcDq7sCn8HoateyYgIL/u2+pInArUiYGemyMEWja0DhD6UdCkY0Ea+YDWeIZKM505N+HClR5jfjjVW35TndY+AldV5OhOzMRmPjtJYS8a0usUXRvmxRfMFSmO9CI1RfNmod9X0= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:01:59 np0005593295 python3[4583]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:01:59 np0005593295 python3[4682]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:02:00 np0005593295 python3[4753]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769158919.5010436-254-168213695087426/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=79d1f7c5e92f4d57bb17665cf28be8d8_id_rsa follow=False checksum=70fc72f3adde7c23bd22f0e2ad4ebdd2e15c011a backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:00 np0005593295 python3[4876]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:02:01 np0005593295 python3[4947]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769158920.5483115-309-25150256250709/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=79d1f7c5e92f4d57bb17665cf28be8d8_id_rsa.pub follow=False checksum=1817e5216c13f90f69486a375706d090e99f2d79 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:02 np0005593295 python3[4995]: ansible-ping Invoked with data=pong
Jan 23 04:02:03 np0005593295 python3[5019]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:02:05 np0005593295 python3[5077]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 23 04:02:07 np0005593295 python3[5109]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:07 np0005593295 python3[5133]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:07 np0005593295 python3[5157]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:07 np0005593295 python3[5181]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:08 np0005593295 python3[5205]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:08 np0005593295 python3[5229]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:10 np0005593295 python3[5255]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:10 np0005593295 python3[5333]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:02:11 np0005593295 python3[5406]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158930.469577-34-237662813535663/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:12 np0005593295 python3[5454]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:12 np0005593295 python3[5478]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:12 np0005593295 python3[5502]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:12 np0005593295 python3[5526]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:13 np0005593295 python3[5550]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:13 np0005593295 python3[5574]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:13 np0005593295 python3[5598]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:13 np0005593295 python3[5622]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:14 np0005593295 python3[5646]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:14 np0005593295 python3[5670]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:14 np0005593295 python3[5694]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:14 np0005593295 python3[5718]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:15 np0005593295 python3[5742]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:15 np0005593295 python3[5766]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:15 np0005593295 python3[5790]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:16 np0005593295 python3[5814]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:16 np0005593295 python3[5838]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:16 np0005593295 python3[5862]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:16 np0005593295 python3[5886]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:17 np0005593295 python3[5910]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:17 np0005593295 python3[5934]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:17 np0005593295 python3[5958]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:18 np0005593295 python3[5982]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:18 np0005593295 python3[6006]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:18 np0005593295 python3[6030]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:18 np0005593295 python3[6054]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:02:21 np0005593295 python3[6080]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 23 04:02:21 np0005593295 systemd[1]: Starting Time & Date Service...
Jan 23 04:02:22 np0005593295 systemd[1]: Started Time & Date Service.
Jan 23 04:02:22 np0005593295 systemd-timedated[6082]: Changed time zone to 'UTC' (UTC).
Jan 23 04:02:22 np0005593295 python3[6111]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:22 np0005593295 python3[6187]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:02:23 np0005593295 python3[6258]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769158942.6668499-254-128377911828478/source _original_basename=tmp452_6lu8 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:23 np0005593295 python3[6358]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:02:24 np0005593295 python3[6429]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769158943.5654054-304-117808605601701/source _original_basename=tmpiaqmet1f follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:24 np0005593295 python3[6531]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:02:25 np0005593295 python3[6604]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769158944.7552545-384-176906075530815/source _original_basename=tmpgty3_bc_ follow=False checksum=8f68793d163f2a5535dcdbaa3731e7670c26af6c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:26 np0005593295 python3[6652]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:02:26 np0005593295 python3[6678]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:02:27 np0005593295 python3[6758]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:02:27 np0005593295 python3[6831]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158947.3532264-454-71423814110502/source _original_basename=tmpdkzgjeqd follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:28 np0005593295 python3[6882]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-639e-86bd-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:02:29 np0005593295 python3[6910]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-639e-86bd-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 23 04:02:30 np0005593295 python3[6938]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:48 np0005593295 python3[6964]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:02:52 np0005593295 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 04:03:46 np0005593295 systemd[4325]: Starting Mark boot as successful...
Jan 23 04:03:46 np0005593295 systemd[4325]: Finished Mark boot as successful.
Jan 23 04:03:48 np0005593295 systemd-logind[786]: Session 1 logged out. Waiting for processes to exit.
Jan 23 04:04:18 np0005593295 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 23 04:04:18 np0005593295 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 23 04:04:18 np0005593295 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 23 04:04:18 np0005593295 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 23 04:04:18 np0005593295 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 23 04:04:18 np0005593295 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 23 04:04:18 np0005593295 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 23 04:04:18 np0005593295 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 23 04:04:18 np0005593295 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 23 04:04:18 np0005593295 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 23 04:04:18 np0005593295 NetworkManager[858]: <info>  [1769159058.4490] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 23 04:04:18 np0005593295 systemd-udevd[6969]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:04:18 np0005593295 NetworkManager[858]: <info>  [1769159058.4637] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:04:18 np0005593295 NetworkManager[858]: <info>  [1769159058.4661] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 23 04:04:18 np0005593295 NetworkManager[858]: <info>  [1769159058.4663] device (eth1): carrier: link connected
Jan 23 04:04:18 np0005593295 NetworkManager[858]: <info>  [1769159058.4665] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 23 04:04:18 np0005593295 NetworkManager[858]: <info>  [1769159058.4671] policy: auto-activating connection 'Wired connection 1' (52728e87-b91d-3812-9239-09489880e5d3)
Jan 23 04:04:18 np0005593295 NetworkManager[858]: <info>  [1769159058.4675] device (eth1): Activation: starting connection 'Wired connection 1' (52728e87-b91d-3812-9239-09489880e5d3)
Jan 23 04:04:18 np0005593295 NetworkManager[858]: <info>  [1769159058.4675] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:04:18 np0005593295 NetworkManager[858]: <info>  [1769159058.4678] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:04:18 np0005593295 NetworkManager[858]: <info>  [1769159058.4681] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:04:18 np0005593295 NetworkManager[858]: <info>  [1769159058.4685] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 23 04:04:19 np0005593295 systemd-logind[786]: New session 3 of user zuul.
Jan 23 04:04:19 np0005593295 systemd[1]: Started Session 3 of User zuul.
Jan 23 04:04:19 np0005593295 python3[6999]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-4543-3693-0000000001f6-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:04:29 np0005593295 python3[7079]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:04:29 np0005593295 python3[7152]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769159069.1411424-206-135901253566715/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=9da3c4bf865aebe0db6de516128830b2cb557851 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:04:30 np0005593295 python3[7202]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:04:30 np0005593295 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 23 04:04:30 np0005593295 systemd[1]: Stopped Network Manager Wait Online.
Jan 23 04:04:30 np0005593295 systemd[1]: Stopping Network Manager Wait Online...
Jan 23 04:04:30 np0005593295 NetworkManager[858]: <info>  [1769159070.2515] caught SIGTERM, shutting down normally.
Jan 23 04:04:30 np0005593295 systemd[1]: Stopping Network Manager...
Jan 23 04:04:30 np0005593295 NetworkManager[858]: <info>  [1769159070.2523] dhcp4 (eth0): canceled DHCP transaction
Jan 23 04:04:30 np0005593295 NetworkManager[858]: <info>  [1769159070.2525] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 04:04:30 np0005593295 NetworkManager[858]: <info>  [1769159070.2525] dhcp4 (eth0): state changed no lease
Jan 23 04:04:30 np0005593295 NetworkManager[858]: <info>  [1769159070.2526] manager: NetworkManager state is now CONNECTING
Jan 23 04:04:30 np0005593295 NetworkManager[858]: <info>  [1769159070.2588] dhcp4 (eth1): canceled DHCP transaction
Jan 23 04:04:30 np0005593295 NetworkManager[858]: <info>  [1769159070.2589] dhcp4 (eth1): state changed no lease
Jan 23 04:04:30 np0005593295 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 04:04:30 np0005593295 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 04:04:40 np0005593295 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 04:04:43 np0005593295 NetworkManager[858]: <info>  [1769159083.9221] exiting (success)
Jan 23 04:04:43 np0005593295 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 23 04:04:43 np0005593295 systemd[1]: Stopped Network Manager.
Jan 23 04:04:43 np0005593295 systemd[1]: NetworkManager.service: Consumed 1.322s CPU time, 9.9M memory peak.
Jan 23 04:04:43 np0005593295 systemd[1]: Starting Network Manager...
Jan 23 04:04:43 np0005593295 NetworkManager[7219]: <info>  [1769159083.9833] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:20df1b08-a5ba-4a35-8d47-00aa8e9b2616)
Jan 23 04:04:43 np0005593295 NetworkManager[7219]: <info>  [1769159083.9834] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 23 04:04:43 np0005593295 NetworkManager[7219]: <info>  [1769159083.9884] manager[0x55f8e0457000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 23 04:04:44 np0005593295 systemd[1]: Starting Hostname Service...
Jan 23 04:04:44 np0005593295 systemd[1]: Started Hostname Service.
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0595] hostname: hostname: using hostnamed
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0596] hostname: static hostname changed from (none) to "np0005593295.novalocal"
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0603] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0609] manager[0x55f8e0457000]: rfkill: Wi-Fi hardware radio set enabled
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0610] manager[0x55f8e0457000]: rfkill: WWAN hardware radio set enabled
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0651] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0652] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0653] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0653] manager: Networking is enabled by state file
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0657] settings: Loaded settings plugin: keyfile (internal)
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0663] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0711] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0728] dhcp: init: Using DHCP client 'internal'
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0732] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0742] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0752] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0765] device (lo): Activation: starting connection 'lo' (a94dd518-f501-4cf9-bb13-731d2edd38ea)
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0777] device (eth0): carrier: link connected
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0784] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0793] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0794] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0806] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0818] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0828] device (eth1): carrier: link connected
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0834] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0843] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (52728e87-b91d-3812-9239-09489880e5d3) (indicated)
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0844] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0854] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0865] device (eth1): Activation: starting connection 'Wired connection 1' (52728e87-b91d-3812-9239-09489880e5d3)
Jan 23 04:04:44 np0005593295 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0876] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 23 04:04:44 np0005593295 systemd[1]: Started Network Manager.
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0884] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0891] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0894] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0899] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0906] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0911] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0915] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0921] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0933] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0939] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0954] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 04:04:44 np0005593295 systemd[1]: Starting Network Manager Wait Online...
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0959] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 23 04:04:44 np0005593295 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.0987] dhcp4 (eth0): state changed new lease, address=38.129.56.185
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.1003] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 23 04:04:44 np0005593295 python3[7264]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-4543-3693-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.8408] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.8435] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.8445] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.8451] device (lo): Activation: successful, device activated.
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.8503] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.8510] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.8519] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.8523] device (eth0): Activation: successful, device activated.
Jan 23 04:04:44 np0005593295 NetworkManager[7219]: <info>  [1769159084.8531] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 23 04:04:54 np0005593295 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 04:05:14 np0005593295 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 04:05:29 np0005593295 NetworkManager[7219]: <info>  [1769159129.2911] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 04:05:29 np0005593295 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 04:05:29 np0005593295 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 04:05:29 np0005593295 NetworkManager[7219]: <info>  [1769159129.3204] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 04:05:29 np0005593295 NetworkManager[7219]: <info>  [1769159129.3207] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 04:05:29 np0005593295 NetworkManager[7219]: <info>  [1769159129.3211] device (eth1): Activation: successful, device activated.
Jan 23 04:05:29 np0005593295 NetworkManager[7219]: <info>  [1769159129.3216] manager: startup complete
Jan 23 04:05:29 np0005593295 NetworkManager[7219]: <info>  [1769159129.3219] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 23 04:05:29 np0005593295 NetworkManager[7219]: <warn>  [1769159129.3222] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 23 04:05:29 np0005593295 NetworkManager[7219]: <info>  [1769159129.3228] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 23 04:05:29 np0005593295 systemd[1]: Finished Network Manager Wait Online.
Jan 23 04:05:29 np0005593295 NetworkManager[7219]: <info>  [1769159129.3318] dhcp4 (eth1): canceled DHCP transaction
Jan 23 04:05:29 np0005593295 NetworkManager[7219]: <info>  [1769159129.3318] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 23 04:05:29 np0005593295 NetworkManager[7219]: <info>  [1769159129.3318] dhcp4 (eth1): state changed no lease
Jan 23 04:05:29 np0005593295 NetworkManager[7219]: <info>  [1769159129.3330] policy: auto-activating connection 'ci-private-network' (8b069e9e-bd63-5e9d-bdd1-b5c43b66b918)
Jan 23 04:05:29 np0005593295 NetworkManager[7219]: <info>  [1769159129.3334] device (eth1): Activation: starting connection 'ci-private-network' (8b069e9e-bd63-5e9d-bdd1-b5c43b66b918)
Jan 23 04:05:29 np0005593295 NetworkManager[7219]: <info>  [1769159129.3335] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:05:29 np0005593295 NetworkManager[7219]: <info>  [1769159129.3337] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:05:29 np0005593295 NetworkManager[7219]: <info>  [1769159129.3343] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:05:29 np0005593295 NetworkManager[7219]: <info>  [1769159129.3349] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:05:29 np0005593295 NetworkManager[7219]: <info>  [1769159129.3462] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:05:29 np0005593295 NetworkManager[7219]: <info>  [1769159129.3464] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:05:29 np0005593295 NetworkManager[7219]: <info>  [1769159129.3468] device (eth1): Activation: successful, device activated.
Jan 23 04:05:39 np0005593295 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 04:05:44 np0005593295 systemd[1]: session-3.scope: Deactivated successfully.
Jan 23 04:05:44 np0005593295 systemd[1]: session-3.scope: Consumed 1.378s CPU time.
Jan 23 04:05:44 np0005593295 systemd-logind[786]: Session 3 logged out. Waiting for processes to exit.
Jan 23 04:05:44 np0005593295 systemd-logind[786]: Removed session 3.
Jan 23 04:05:54 np0005593295 systemd-logind[786]: New session 4 of user zuul.
Jan 23 04:05:54 np0005593295 systemd[1]: Started Session 4 of User zuul.
Jan 23 04:05:54 np0005593295 python3[7399]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:05:55 np0005593295 python3[7472]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159154.5039935-373-168282419267218/source _original_basename=tmpv125ucd2 follow=False checksum=6e1e8970cf6ad2f0b1a32d462d71e8a0528ec2d9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:05:57 np0005593295 systemd[1]: session-4.scope: Deactivated successfully.
Jan 23 04:05:57 np0005593295 systemd-logind[786]: Session 4 logged out. Waiting for processes to exit.
Jan 23 04:05:57 np0005593295 systemd-logind[786]: Removed session 4.
Jan 23 04:06:46 np0005593295 systemd[4325]: Created slice User Background Tasks Slice.
Jan 23 04:06:46 np0005593295 systemd[4325]: Starting Cleanup of User's Temporary Files and Directories...
Jan 23 04:06:46 np0005593295 systemd[4325]: Finished Cleanup of User's Temporary Files and Directories.
Jan 23 04:14:49 np0005593295 systemd-logind[786]: New session 5 of user zuul.
Jan 23 04:14:49 np0005593295 systemd[1]: Started Session 5 of User zuul.
Jan 23 04:14:50 np0005593295 python3[7533]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-5353-1fb2-00000000217f-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:14:50 np0005593295 python3[7561]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:50 np0005593295 python3[7588]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:51 np0005593295 python3[7614]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:51 np0005593295 python3[7640]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:52 np0005593295 python3[7666]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:52 np0005593295 python3[7744]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:14:53 np0005593295 python3[7817]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159692.5070572-547-243315707134321/source _original_basename=tmprhbvswkc follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:54 np0005593295 python3[7867]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 04:14:54 np0005593295 systemd[1]: Reloading.
Jan 23 04:14:54 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:14:56 np0005593295 python3[7923]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 23 04:14:56 np0005593295 python3[7949]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:14:57 np0005593295 python3[7977]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:14:57 np0005593295 python3[8005]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:14:57 np0005593295 python3[8033]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:14:58 np0005593295 python3[8060]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-5353-1fb2-000000002186-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:14:59 np0005593295 python3[8090]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 23 04:15:03 np0005593295 systemd[1]: session-5.scope: Deactivated successfully.
Jan 23 04:15:03 np0005593295 systemd[1]: session-5.scope: Consumed 3.841s CPU time.
Jan 23 04:15:03 np0005593295 systemd-logind[786]: Session 5 logged out. Waiting for processes to exit.
Jan 23 04:15:03 np0005593295 systemd-logind[786]: Removed session 5.
Jan 23 04:15:05 np0005593295 systemd-logind[786]: New session 6 of user zuul.
Jan 23 04:15:05 np0005593295 systemd[1]: Started Session 6 of User zuul.
Jan 23 04:15:05 np0005593295 python3[8124]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 23 04:15:14 np0005593295 setsebool[8166]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 23 04:15:14 np0005593295 setsebool[8166]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 23 04:15:30 np0005593295 kernel: SELinux:  Converting 386 SID table entries...
Jan 23 04:15:30 np0005593295 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 04:15:30 np0005593295 kernel: SELinux:  policy capability open_perms=1
Jan 23 04:15:30 np0005593295 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 04:15:30 np0005593295 kernel: SELinux:  policy capability always_check_network=0
Jan 23 04:15:30 np0005593295 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 04:15:30 np0005593295 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 04:15:30 np0005593295 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 04:15:43 np0005593295 kernel: SELinux:  Converting 389 SID table entries...
Jan 23 04:15:43 np0005593295 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 04:15:43 np0005593295 kernel: SELinux:  policy capability open_perms=1
Jan 23 04:15:43 np0005593295 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 04:15:43 np0005593295 kernel: SELinux:  policy capability always_check_network=0
Jan 23 04:15:43 np0005593295 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 04:15:43 np0005593295 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 04:15:43 np0005593295 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 04:16:00 np0005593295 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 23 04:16:00 np0005593295 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 23 04:16:00 np0005593295 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 23 04:16:00 np0005593295 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 23 04:16:00 np0005593295 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 23 04:16:00 np0005593295 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 04:16:01 np0005593295 systemd[1]: Starting man-db-cache-update.service...
Jan 23 04:16:01 np0005593295 systemd[1]: Reloading.
Jan 23 04:16:01 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:16:01 np0005593295 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 04:16:15 np0005593295 python3[17841]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-f136-f057-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:16:18 np0005593295 kernel: evm: overlay not supported
Jan 23 04:16:18 np0005593295 systemd[4325]: Starting D-Bus User Message Bus...
Jan 23 04:16:18 np0005593295 dbus-broker-launch[18385]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 23 04:16:18 np0005593295 dbus-broker-launch[18385]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 23 04:16:18 np0005593295 systemd[4325]: Started D-Bus User Message Bus.
Jan 23 04:16:18 np0005593295 dbus-broker-lau[18385]: Ready
Jan 23 04:16:18 np0005593295 systemd[4325]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 23 04:16:18 np0005593295 systemd[4325]: Created slice Slice /user.
Jan 23 04:16:18 np0005593295 systemd[4325]: podman-18317.scope: unit configures an IP firewall, but not running as root.
Jan 23 04:16:18 np0005593295 systemd[4325]: (This warning is only shown for the first unit using IP firewalling.)
Jan 23 04:16:18 np0005593295 systemd[4325]: Started podman-18317.scope.
Jan 23 04:16:18 np0005593295 systemd[4325]: Started podman-pause-cc5e626e.scope.
Jan 23 04:16:20 np0005593295 systemd[1]: session-6.scope: Deactivated successfully.
Jan 23 04:16:20 np0005593295 systemd[1]: session-6.scope: Consumed 48.066s CPU time.
Jan 23 04:16:20 np0005593295 systemd-logind[786]: Session 6 logged out. Waiting for processes to exit.
Jan 23 04:16:20 np0005593295 systemd-logind[786]: Removed session 6.
Jan 23 04:16:41 np0005593295 systemd-logind[786]: New session 7 of user zuul.
Jan 23 04:16:41 np0005593295 systemd[1]: Started Session 7 of User zuul.
Jan 23 04:16:41 np0005593295 python3[28237]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIXU6aMT27gF+Yfs/YZWwo3YepWSGuQLHNXTSuo3za5wTzqiDdK4Z0aI/Vfz5yHXRMPrH9UNJkm8FGQwkK4yHMQ= zuul@np0005593292.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:16:42 np0005593295 python3[28471]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIXU6aMT27gF+Yfs/YZWwo3YepWSGuQLHNXTSuo3za5wTzqiDdK4Z0aI/Vfz5yHXRMPrH9UNJkm8FGQwkK4yHMQ= zuul@np0005593292.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:16:43 np0005593295 python3[28899]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005593295.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 23 04:16:44 np0005593295 python3[29500]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIXU6aMT27gF+Yfs/YZWwo3YepWSGuQLHNXTSuo3za5wTzqiDdK4Z0aI/Vfz5yHXRMPrH9UNJkm8FGQwkK4yHMQ= zuul@np0005593292.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 04:16:44 np0005593295 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 04:16:44 np0005593295 systemd[1]: Finished man-db-cache-update.service.
Jan 23 04:16:44 np0005593295 systemd[1]: man-db-cache-update.service: Consumed 51.162s CPU time.
Jan 23 04:16:44 np0005593295 systemd[1]: run-r160dbfaf61024b97a14b61df3240348e.service: Deactivated successfully.
Jan 23 04:16:45 np0005593295 python3[29782]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:16:45 np0005593295 python3[29856]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159804.773685-153-263573844720942/source _original_basename=tmpjkqig1x_ follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:16:46 np0005593295 python3[29906]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Jan 23 04:16:46 np0005593295 systemd[1]: Starting Hostname Service...
Jan 23 04:16:46 np0005593295 systemd[1]: Started Hostname Service.
Jan 23 04:16:46 np0005593295 systemd-hostnamed[29910]: Changed pretty hostname to 'compute-2'
Jan 23 04:16:46 np0005593295 systemd-hostnamed[29910]: Hostname set to <compute-2> (static)
Jan 23 04:16:46 np0005593295 NetworkManager[7219]: <info>  [1769159806.5835] hostname: static hostname changed from "np0005593295.novalocal" to "compute-2"
Jan 23 04:16:46 np0005593295 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 04:16:46 np0005593295 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 04:16:47 np0005593295 systemd[1]: session-7.scope: Deactivated successfully.
Jan 23 04:16:47 np0005593295 systemd-logind[786]: Session 7 logged out. Waiting for processes to exit.
Jan 23 04:16:47 np0005593295 systemd[1]: session-7.scope: Consumed 2.141s CPU time.
Jan 23 04:16:47 np0005593295 systemd-logind[786]: Removed session 7.
Jan 23 04:16:56 np0005593295 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 04:17:16 np0005593295 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 04:21:46 np0005593295 systemd-logind[786]: New session 8 of user zuul.
Jan 23 04:21:46 np0005593295 systemd[1]: Started Session 8 of User zuul.
Jan 23 04:21:46 np0005593295 python3[30008]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:21:48 np0005593295 python3[30124]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:21:49 np0005593295 python3[30197]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.7014406-34066-141547719505615/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:49 np0005593295 python3[30223]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:21:50 np0005593295 python3[30296]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.7014406-34066-141547719505615/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:50 np0005593295 python3[30322]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:21:50 np0005593295 python3[30395]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.7014406-34066-141547719505615/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:50 np0005593295 python3[30421]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:21:51 np0005593295 python3[30494]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.7014406-34066-141547719505615/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:51 np0005593295 python3[30520]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:21:51 np0005593295 python3[30593]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.7014406-34066-141547719505615/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:52 np0005593295 python3[30619]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:21:52 np0005593295 python3[30692]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.7014406-34066-141547719505615/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:52 np0005593295 python3[30718]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:21:52 np0005593295 python3[30791]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.7014406-34066-141547719505615/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:22:05 np0005593295 python3[30839]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:27:05 np0005593295 systemd[1]: session-8.scope: Deactivated successfully.
Jan 23 04:27:05 np0005593295 systemd[1]: session-8.scope: Consumed 4.494s CPU time.
Jan 23 04:27:05 np0005593295 systemd-logind[786]: Session 8 logged out. Waiting for processes to exit.
Jan 23 04:27:05 np0005593295 systemd-logind[786]: Removed session 8.
Jan 23 04:33:46 np0005593295 systemd[1]: Starting dnf makecache...
Jan 23 04:33:46 np0005593295 dnf[30850]: Failed determining last makecache time.
Jan 23 04:33:46 np0005593295 dnf[30850]: delorean-openstack-barbican-42b4c41831408a8e323 316 kB/s |  13 kB     00:00
Jan 23 04:33:46 np0005593295 dnf[30850]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 2.5 MB/s |  65 kB     00:00
Jan 23 04:33:46 np0005593295 dnf[30850]: delorean-openstack-cinder-1c00d6490d88e436f26ef 1.3 MB/s |  32 kB     00:00
Jan 23 04:33:47 np0005593295 dnf[30850]: delorean-python-stevedore-c4acc5639fd2329372142 245 kB/s | 131 kB     00:00
Jan 23 04:33:47 np0005593295 dnf[30850]: delorean-python-cloudkitty-tests-tempest-2c80f8 293 kB/s |  32 kB     00:00
Jan 23 04:33:47 np0005593295 dnf[30850]: delorean-os-refresh-config-9bfc52b5049be2d8de61 7.6 MB/s | 349 kB     00:00
Jan 23 04:33:48 np0005593295 dnf[30850]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 214 kB/s |  42 kB     00:00
Jan 23 04:33:48 np0005593295 dnf[30850]: delorean-python-designate-tests-tempest-347fdbc 620 kB/s |  18 kB     00:00
Jan 23 04:33:48 np0005593295 dnf[30850]: delorean-openstack-glance-1fd12c29b339f30fe823e 600 kB/s |  18 kB     00:00
Jan 23 04:33:48 np0005593295 dnf[30850]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 891 kB/s |  29 kB     00:00
Jan 23 04:33:48 np0005593295 dnf[30850]: delorean-openstack-manila-3c01b7181572c95dac462 918 kB/s |  25 kB     00:00
Jan 23 04:33:48 np0005593295 dnf[30850]: delorean-python-whitebox-neutron-tests-tempest- 5.5 MB/s | 154 kB     00:00
Jan 23 04:33:48 np0005593295 dnf[30850]: delorean-openstack-octavia-ba397f07a7331190208c 221 kB/s |  26 kB     00:00
Jan 23 04:33:48 np0005593295 dnf[30850]: delorean-openstack-watcher-c014f81a8647287f6dcc 399 kB/s |  16 kB     00:00
Jan 23 04:33:48 np0005593295 dnf[30850]: delorean-ansible-config_template-5ccaa22121a7ff 218 kB/s | 7.4 kB     00:00
Jan 23 04:33:48 np0005593295 dnf[30850]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 2.5 MB/s | 144 kB     00:00
Jan 23 04:33:48 np0005593295 dnf[30850]: delorean-openstack-swift-dc98a8463506ac520c469a 301 kB/s |  14 kB     00:00
Jan 23 04:33:48 np0005593295 dnf[30850]: delorean-python-tempestconf-8515371b7cceebd4282 758 kB/s |  53 kB     00:00
Jan 23 04:33:48 np0005593295 dnf[30850]: delorean-openstack-heat-ui-013accbfd179753bc3f0 2.6 MB/s |  96 kB     00:00
Jan 23 04:33:48 np0005593295 dnf[30850]: CentOS Stream 9 - BaseOS                         67 kB/s | 6.7 kB     00:00
Jan 23 04:33:49 np0005593295 dnf[30850]: CentOS Stream 9 - AppStream                      65 kB/s | 6.8 kB     00:00
Jan 23 04:33:49 np0005593295 dnf[30850]: CentOS Stream 9 - CRB                            58 kB/s | 6.6 kB     00:00
Jan 23 04:33:49 np0005593295 dnf[30850]: CentOS Stream 9 - Extras packages                69 kB/s | 7.3 kB     00:00
Jan 23 04:33:49 np0005593295 dnf[30850]: dlrn-antelope-testing                           7.0 MB/s | 1.1 MB     00:00
Jan 23 04:33:49 np0005593295 dnf[30850]: dlrn-antelope-build-deps                         14 MB/s | 461 kB     00:00
Jan 23 04:33:50 np0005593295 dnf[30850]: centos9-rabbitmq                                7.9 MB/s | 123 kB     00:00
Jan 23 04:33:50 np0005593295 dnf[30850]: centos9-storage                                  19 MB/s | 415 kB     00:00
Jan 23 04:33:50 np0005593295 dnf[30850]: centos9-opstools                                3.9 MB/s |  51 kB     00:00
Jan 23 04:33:50 np0005593295 dnf[30850]: NFV SIG OpenvSwitch                              22 MB/s | 461 kB     00:00
Jan 23 04:33:52 np0005593295 dnf[30850]: repo-setup-centos-appstream                      18 MB/s |  26 MB     00:01
Jan 23 04:33:58 np0005593295 dnf[30850]: repo-setup-centos-baseos                         30 MB/s | 8.9 MB     00:00
Jan 23 04:34:00 np0005593295 dnf[30850]: repo-setup-centos-highavailability               22 MB/s | 744 kB     00:00
Jan 23 04:34:00 np0005593295 dnf[30850]: repo-setup-centos-powertools                     37 MB/s | 7.6 MB     00:00
Jan 23 04:34:03 np0005593295 dnf[30850]: Extra Packages for Enterprise Linux 9 - x86_64   18 MB/s |  20 MB     00:01
Jan 23 04:34:21 np0005593295 dnf[30850]: Metadata cache created.
Jan 23 04:34:21 np0005593295 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 23 04:34:21 np0005593295 systemd[1]: Finished dnf makecache.
Jan 23 04:34:21 np0005593295 systemd[1]: dnf-makecache.service: Consumed 29.511s CPU time.
Jan 23 04:37:14 np0005593295 systemd-logind[786]: New session 9 of user zuul.
Jan 23 04:37:14 np0005593295 systemd[1]: Started Session 9 of User zuul.
Jan 23 04:37:15 np0005593295 python3.9[31106]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:37:16 np0005593295 python3.9[31287]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:37:29 np0005593295 systemd[1]: session-9.scope: Deactivated successfully.
Jan 23 04:37:29 np0005593295 systemd[1]: session-9.scope: Consumed 8.364s CPU time.
Jan 23 04:37:29 np0005593295 systemd-logind[786]: Session 9 logged out. Waiting for processes to exit.
Jan 23 04:37:29 np0005593295 systemd-logind[786]: Removed session 9.
Jan 23 04:37:46 np0005593295 systemd-logind[786]: New session 10 of user zuul.
Jan 23 04:37:46 np0005593295 systemd[1]: Started Session 10 of User zuul.
Jan 23 04:37:47 np0005593295 python3.9[31497]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 23 04:37:48 np0005593295 python3.9[31671]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:37:49 np0005593295 python3.9[31823]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:37:51 np0005593295 python3.9[31976]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:37:52 np0005593295 python3.9[32128]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:37:52 np0005593295 python3.9[32280]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:37:53 np0005593295 python3.9[32403]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161072.428582-175-40845375630410/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:37:54 np0005593295 python3.9[32555]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:37:55 np0005593295 python3.9[32711]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:37:56 np0005593295 python3.9[32863]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:37:57 np0005593295 python3.9[33013]: ansible-ansible.builtin.service_facts Invoked
Jan 23 04:38:00 np0005593295 python3.9[33266]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:38:01 np0005593295 python3.9[33416]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:38:03 np0005593295 python3.9[33570]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:38:04 np0005593295 python3.9[33728]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:38:05 np0005593295 python3.9[33812]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:38:57 np0005593295 systemd[1]: Reloading.
Jan 23 04:38:57 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:38:57 np0005593295 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 23 04:38:58 np0005593295 systemd[1]: Reloading.
Jan 23 04:38:58 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:38:58 np0005593295 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 23 04:38:58 np0005593295 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 23 04:38:58 np0005593295 systemd[1]: Reloading.
Jan 23 04:38:58 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:38:58 np0005593295 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 23 04:38:58 np0005593295 dbus-broker-launch[744]: Noticed file-system modification, trigger reload.
Jan 23 04:38:58 np0005593295 dbus-broker-launch[744]: Noticed file-system modification, trigger reload.
Jan 23 04:38:58 np0005593295 dbus-broker-launch[744]: Noticed file-system modification, trigger reload.
Jan 23 04:40:18 np0005593295 kernel: SELinux:  Converting 2725 SID table entries...
Jan 23 04:40:18 np0005593295 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 04:40:18 np0005593295 kernel: SELinux:  policy capability open_perms=1
Jan 23 04:40:18 np0005593295 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 04:40:18 np0005593295 kernel: SELinux:  policy capability always_check_network=0
Jan 23 04:40:18 np0005593295 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 04:40:18 np0005593295 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 04:40:18 np0005593295 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 04:40:18 np0005593295 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 23 04:40:18 np0005593295 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 04:40:18 np0005593295 systemd[1]: Starting man-db-cache-update.service...
Jan 23 04:40:18 np0005593295 systemd[1]: Reloading.
Jan 23 04:40:18 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:40:18 np0005593295 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 04:40:20 np0005593295 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 04:40:20 np0005593295 systemd[1]: Finished man-db-cache-update.service.
Jan 23 04:40:20 np0005593295 systemd[1]: man-db-cache-update.service: Consumed 1.281s CPU time.
Jan 23 04:40:20 np0005593295 systemd[1]: run-rd2a9490f43a342dd977239ca781a8fd0.service: Deactivated successfully.
Jan 23 04:40:32 np0005593295 python3.9[35334]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:40:34 np0005593295 python3.9[35615]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 23 04:40:35 np0005593295 python3.9[35767]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 23 04:40:40 np0005593295 python3.9[35920]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:40:44 np0005593295 python3.9[36073]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 23 04:40:48 np0005593295 python3.9[36225]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:40:49 np0005593295 python3.9[36377]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:40:49 np0005593295 python3.9[36500]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161248.79358-665-99883130303473/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:40:56 np0005593295 python3.9[36652]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:40:57 np0005593295 python3.9[36804]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:40:57 np0005593295 python3.9[36957]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:40:59 np0005593295 python3.9[37109]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 23 04:41:00 np0005593295 python3.9[37262]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 04:41:00 np0005593295 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:41:01 np0005593295 python3.9[37421]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 04:41:02 np0005593295 python3.9[37581]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 23 04:41:03 np0005593295 python3.9[37734]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 04:41:04 np0005593295 python3.9[37892]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 23 04:41:06 np0005593295 python3.9[38044]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:41:11 np0005593295 python3.9[38197]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:41:12 np0005593295 python3.9[38349]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:41:12 np0005593295 python3.9[38472]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769161271.7453775-1021-210806067985898/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:41:13 np0005593295 python3.9[38624]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:41:13 np0005593295 systemd[1]: Starting Load Kernel Modules...
Jan 23 04:41:13 np0005593295 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 23 04:41:13 np0005593295 kernel: Bridge firewalling registered
Jan 23 04:41:13 np0005593295 systemd-modules-load[38628]: Inserted module 'br_netfilter'
Jan 23 04:41:13 np0005593295 systemd[1]: Finished Load Kernel Modules.
Jan 23 04:41:14 np0005593295 python3.9[38783]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:41:15 np0005593295 python3.9[38906]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769161274.2259042-1090-119051593418181/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:41:16 np0005593295 python3.9[39058]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:41:20 np0005593295 dbus-broker-launch[744]: Noticed file-system modification, trigger reload.
Jan 23 04:41:20 np0005593295 dbus-broker-launch[744]: Noticed file-system modification, trigger reload.
Jan 23 04:41:20 np0005593295 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 04:41:20 np0005593295 systemd[1]: Starting man-db-cache-update.service...
Jan 23 04:41:20 np0005593295 systemd[1]: Reloading.
Jan 23 04:41:21 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:41:21 np0005593295 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 04:41:23 np0005593295 python3.9[40816]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:41:23 np0005593295 python3.9[41892]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 23 04:41:24 np0005593295 python3.9[42736]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:41:25 np0005593295 python3.9[43140]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:41:25 np0005593295 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 23 04:41:26 np0005593295 systemd[1]: Starting Authorization Manager...
Jan 23 04:41:26 np0005593295 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 23 04:41:26 np0005593295 polkitd[43445]: Started polkitd version 0.117
Jan 23 04:41:26 np0005593295 systemd[1]: Started Authorization Manager.
Jan 23 04:41:27 np0005593295 python3.9[43615]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:41:27 np0005593295 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 23 04:41:28 np0005593295 systemd[1]: tuned.service: Deactivated successfully.
Jan 23 04:41:28 np0005593295 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 23 04:41:28 np0005593295 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 23 04:41:28 np0005593295 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 23 04:41:28 np0005593295 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 04:41:28 np0005593295 systemd[1]: Finished man-db-cache-update.service.
Jan 23 04:41:28 np0005593295 systemd[1]: man-db-cache-update.service: Consumed 4.477s CPU time.
Jan 23 04:41:28 np0005593295 systemd[1]: run-r1b1611b0f9d34e309c0d721a124079b8.service: Deactivated successfully.
Jan 23 04:41:28 np0005593295 python3.9[43778]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 23 04:41:32 np0005593295 python3.9[43930]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:41:32 np0005593295 systemd[1]: Reloading.
Jan 23 04:41:32 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:41:33 np0005593295 python3.9[44120]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:41:33 np0005593295 systemd[1]: Reloading.
Jan 23 04:41:33 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:41:34 np0005593295 python3.9[44309]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:41:35 np0005593295 python3.9[44463]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:41:35 np0005593295 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 23 04:41:36 np0005593295 python3.9[44616]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:41:38 np0005593295 python3.9[44780]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:41:39 np0005593295 python3.9[44933]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:41:39 np0005593295 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 23 04:41:39 np0005593295 systemd[1]: Stopped Apply Kernel Variables.
Jan 23 04:41:39 np0005593295 systemd[1]: Stopping Apply Kernel Variables...
Jan 23 04:41:39 np0005593295 systemd[1]: Starting Apply Kernel Variables...
Jan 23 04:41:39 np0005593295 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 23 04:41:39 np0005593295 systemd[1]: Finished Apply Kernel Variables.
Jan 23 04:41:40 np0005593295 systemd[1]: session-10.scope: Deactivated successfully.
Jan 23 04:41:40 np0005593295 systemd[1]: session-10.scope: Consumed 2min 33.798s CPU time.
Jan 23 04:41:40 np0005593295 systemd-logind[786]: Session 10 logged out. Waiting for processes to exit.
Jan 23 04:41:40 np0005593295 systemd-logind[786]: Removed session 10.
Jan 23 04:41:47 np0005593295 systemd-logind[786]: New session 11 of user zuul.
Jan 23 04:41:47 np0005593295 systemd[1]: Started Session 11 of User zuul.
Jan 23 04:41:48 np0005593295 python3.9[45117]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:41:50 np0005593295 python3.9[45273]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 23 04:41:51 np0005593295 python3.9[45426]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 04:41:52 np0005593295 python3.9[45584]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 04:41:56 np0005593295 python3.9[45744]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:41:56 np0005593295 python3.9[45828]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 04:42:00 np0005593295 python3.9[45991]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:42:16 np0005593295 kernel: SELinux:  Converting 2737 SID table entries...
Jan 23 04:42:16 np0005593295 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 04:42:16 np0005593295 kernel: SELinux:  policy capability open_perms=1
Jan 23 04:42:16 np0005593295 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 04:42:16 np0005593295 kernel: SELinux:  policy capability always_check_network=0
Jan 23 04:42:16 np0005593295 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 04:42:16 np0005593295 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 04:42:16 np0005593295 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 04:42:16 np0005593295 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 23 04:42:16 np0005593295 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 23 04:42:17 np0005593295 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 04:42:17 np0005593295 systemd[1]: Starting man-db-cache-update.service...
Jan 23 04:42:17 np0005593295 systemd[1]: Reloading.
Jan 23 04:42:18 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:42:18 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:42:18 np0005593295 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 04:42:18 np0005593295 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 04:42:18 np0005593295 systemd[1]: Finished man-db-cache-update.service.
Jan 23 04:42:18 np0005593295 systemd[1]: run-r66549dc4f67642688b1013512a0a2329.service: Deactivated successfully.
Jan 23 04:42:24 np0005593295 python3.9[47090]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 04:42:24 np0005593295 systemd[1]: Reloading.
Jan 23 04:42:24 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:42:24 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:42:25 np0005593295 systemd[1]: Starting Open vSwitch Database Unit...
Jan 23 04:42:25 np0005593295 chown[47131]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 23 04:42:25 np0005593295 ovs-ctl[47136]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 23 04:42:25 np0005593295 ovs-ctl[47136]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 23 04:42:25 np0005593295 ovs-ctl[47136]: Starting ovsdb-server [  OK  ]
Jan 23 04:42:25 np0005593295 ovs-vsctl[47185]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 23 04:42:25 np0005593295 ovs-vsctl[47201]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"8fb585ea-168c-48ac-870f-617a4fa1bbde\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 23 04:42:25 np0005593295 ovs-ctl[47136]: Configuring Open vSwitch system IDs [  OK  ]
Jan 23 04:42:25 np0005593295 ovs-ctl[47136]: Enabling remote OVSDB managers [  OK  ]
Jan 23 04:42:25 np0005593295 ovs-vsctl[47210]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Jan 23 04:42:25 np0005593295 systemd[1]: Started Open vSwitch Database Unit.
Jan 23 04:42:25 np0005593295 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 23 04:42:25 np0005593295 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 23 04:42:25 np0005593295 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 23 04:42:25 np0005593295 kernel: openvswitch: Open vSwitch switching datapath
Jan 23 04:42:25 np0005593295 ovs-ctl[47255]: Inserting openvswitch module [  OK  ]
Jan 23 04:42:25 np0005593295 ovs-ctl[47224]: Starting ovs-vswitchd [  OK  ]
Jan 23 04:42:25 np0005593295 ovs-vsctl[47272]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Jan 23 04:42:25 np0005593295 ovs-ctl[47224]: Enabling remote OVSDB managers [  OK  ]
Jan 23 04:42:25 np0005593295 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 23 04:42:25 np0005593295 systemd[1]: Starting Open vSwitch...
Jan 23 04:42:25 np0005593295 systemd[1]: Finished Open vSwitch.
Jan 23 04:42:27 np0005593295 python3.9[47424]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:42:28 np0005593295 python3.9[47576]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 23 04:42:29 np0005593295 kernel: SELinux:  Converting 2751 SID table entries...
Jan 23 04:42:29 np0005593295 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 04:42:29 np0005593295 kernel: SELinux:  policy capability open_perms=1
Jan 23 04:42:29 np0005593295 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 04:42:29 np0005593295 kernel: SELinux:  policy capability always_check_network=0
Jan 23 04:42:29 np0005593295 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 04:42:29 np0005593295 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 04:42:29 np0005593295 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 04:42:31 np0005593295 python3.9[47731]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:42:32 np0005593295 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 23 04:42:32 np0005593295 python3.9[47889]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:42:35 np0005593295 python3.9[48042]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:42:37 np0005593295 python3.9[48329]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 23 04:42:38 np0005593295 python3.9[48479]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:42:38 np0005593295 python3.9[48633]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:42:41 np0005593295 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 04:42:41 np0005593295 systemd[1]: Starting man-db-cache-update.service...
Jan 23 04:42:41 np0005593295 systemd[1]: Reloading.
Jan 23 04:42:41 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:42:41 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:42:41 np0005593295 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 04:42:41 np0005593295 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 04:42:41 np0005593295 systemd[1]: Finished man-db-cache-update.service.
Jan 23 04:42:41 np0005593295 systemd[1]: run-rcdd47222d767459e9fc79b43eeb5d29d.service: Deactivated successfully.
Jan 23 04:42:45 np0005593295 python3.9[48951]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:42:45 np0005593295 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 23 04:42:45 np0005593295 systemd[1]: Stopped Network Manager Wait Online.
Jan 23 04:42:45 np0005593295 systemd[1]: Stopping Network Manager Wait Online...
Jan 23 04:42:45 np0005593295 NetworkManager[7219]: <info>  [1769161365.1061] caught SIGTERM, shutting down normally.
Jan 23 04:42:45 np0005593295 NetworkManager[7219]: <info>  [1769161365.1072] dhcp4 (eth0): canceled DHCP transaction
Jan 23 04:42:45 np0005593295 NetworkManager[7219]: <info>  [1769161365.1072] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 04:42:45 np0005593295 NetworkManager[7219]: <info>  [1769161365.1072] dhcp4 (eth0): state changed no lease
Jan 23 04:42:45 np0005593295 NetworkManager[7219]: <info>  [1769161365.1074] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 04:42:45 np0005593295 systemd[1]: Stopping Network Manager...
Jan 23 04:42:45 np0005593295 NetworkManager[7219]: <info>  [1769161365.1134] exiting (success)
Jan 23 04:42:45 np0005593295 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 04:42:45 np0005593295 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 04:42:45 np0005593295 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 23 04:42:45 np0005593295 systemd[1]: Stopped Network Manager.
Jan 23 04:42:45 np0005593295 systemd[1]: NetworkManager.service: Consumed 13.287s CPU time, 4.1M memory peak, read 0B from disk, written 32.5K to disk.
Jan 23 04:42:45 np0005593295 systemd[1]: Starting Network Manager...
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.1767] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:20df1b08-a5ba-4a35-8d47-00aa8e9b2616)
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.1768] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.1829] manager[0x5622b241e000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 23 04:42:45 np0005593295 systemd[1]: Starting Hostname Service...
Jan 23 04:42:45 np0005593295 systemd[1]: Started Hostname Service.
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2602] hostname: hostname: using hostnamed
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2603] hostname: static hostname changed from (none) to "compute-2"
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2608] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2612] manager[0x5622b241e000]: rfkill: Wi-Fi hardware radio set enabled
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2613] manager[0x5622b241e000]: rfkill: WWAN hardware radio set enabled
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2638] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2647] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2648] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2649] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2650] manager: Networking is enabled by state file
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2652] settings: Loaded settings plugin: keyfile (internal)
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2656] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2685] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2696] dhcp: init: Using DHCP client 'internal'
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2700] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2708] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2713] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2722] device (lo): Activation: starting connection 'lo' (a94dd518-f501-4cf9-bb13-731d2edd38ea)
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2729] device (eth0): carrier: link connected
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2733] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2739] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2739] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2746] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2754] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2760] device (eth1): carrier: link connected
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2765] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2771] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (8b069e9e-bd63-5e9d-bdd1-b5c43b66b918) (indicated)
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2771] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2776] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2784] device (eth1): Activation: starting connection 'ci-private-network' (8b069e9e-bd63-5e9d-bdd1-b5c43b66b918)
Jan 23 04:42:45 np0005593295 systemd[1]: Started Network Manager.
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2792] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2800] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2802] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2803] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2805] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2809] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2812] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2814] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2817] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2825] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2829] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2838] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2850] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2866] dhcp4 (eth0): state changed new lease, address=38.129.56.185
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2873] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2953] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2959] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2961] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2962] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2966] device (lo): Activation: successful, device activated.
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2971] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2975] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2978] device (eth1): Activation: successful, device activated.
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2986] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2989] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2991] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2993] device (eth0): Activation: successful, device activated.
Jan 23 04:42:45 np0005593295 systemd[1]: Starting Network Manager Wait Online...
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2997] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 23 04:42:45 np0005593295 NetworkManager[48964]: <info>  [1769161365.2999] manager: startup complete
Jan 23 04:42:45 np0005593295 systemd[1]: Finished Network Manager Wait Online.
Jan 23 04:42:46 np0005593295 python3.9[49177]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:42:55 np0005593295 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 04:42:58 np0005593295 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 04:42:58 np0005593295 systemd[1]: Starting man-db-cache-update.service...
Jan 23 04:42:58 np0005593295 systemd[1]: Reloading.
Jan 23 04:42:58 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:42:58 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:42:58 np0005593295 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 04:42:59 np0005593295 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 04:42:59 np0005593295 systemd[1]: Finished man-db-cache-update.service.
Jan 23 04:42:59 np0005593295 systemd[1]: run-r938a9d7fd35b453092074315e15b1aeb.service: Deactivated successfully.
Jan 23 04:43:00 np0005593295 python3.9[49637]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:43:01 np0005593295 python3.9[49789]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:02 np0005593295 python3.9[49943]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:02 np0005593295 python3.9[50095]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:03 np0005593295 python3.9[50247]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:04 np0005593295 python3.9[50399]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:04 np0005593295 python3.9[50551]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:43:05 np0005593295 python3.9[50674]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161384.2416582-644-72844948997566/.source _original_basename=.gxspqu7q follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:06 np0005593295 python3.9[50826]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:07 np0005593295 python3.9[50978]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 23 04:43:07 np0005593295 python3.9[51130]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:10 np0005593295 python3.9[51557]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 23 04:43:11 np0005593295 ansible-async_wrapper.py[51732]: Invoked with j829007530849 300 /home/zuul/.ansible/tmp/ansible-tmp-1769161390.690089-842-208224040620380/AnsiballZ_edpm_os_net_config.py _
Jan 23 04:43:11 np0005593295 ansible-async_wrapper.py[51735]: Starting module and watcher
Jan 23 04:43:11 np0005593295 ansible-async_wrapper.py[51735]: Start watching 51736 (300)
Jan 23 04:43:11 np0005593295 ansible-async_wrapper.py[51736]: Start module (51736)
Jan 23 04:43:11 np0005593295 ansible-async_wrapper.py[51732]: Return async_wrapper task started.
Jan 23 04:43:11 np0005593295 python3.9[51737]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 23 04:43:12 np0005593295 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 23 04:43:12 np0005593295 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 23 04:43:12 np0005593295 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 23 04:43:12 np0005593295 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 23 04:43:12 np0005593295 kernel: cfg80211: failed to load regulatory.db
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6062] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51738 uid=0 result="success"
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6082] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51738 uid=0 result="success"
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6581] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6584] audit: op="connection-add" uuid="8afb3294-ce59-4632-a155-fd329a29f291" name="br-ex-br" pid=51738 uid=0 result="success"
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6599] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6601] audit: op="connection-add" uuid="41bf286c-1626-4e6a-a1a5-43899fda1607" name="br-ex-port" pid=51738 uid=0 result="success"
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6611] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6613] audit: op="connection-add" uuid="d5093589-ee87-4ea7-8b7c-afde35c70b3c" name="eth1-port" pid=51738 uid=0 result="success"
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6624] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6626] audit: op="connection-add" uuid="0ba7ccec-a493-41fb-8252-9c804fe27a7d" name="vlan20-port" pid=51738 uid=0 result="success"
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6636] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6638] audit: op="connection-add" uuid="c0864541-42de-4ccd-af05-4c7ce87a2504" name="vlan21-port" pid=51738 uid=0 result="success"
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6647] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6649] audit: op="connection-add" uuid="e9160306-09c2-4c99-9b9a-e0cda1df1e76" name="vlan22-port" pid=51738 uid=0 result="success"
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6658] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6659] audit: op="connection-add" uuid="1050024d-b7d9-4ad3-83a2-892c0a4b608a" name="vlan23-port" pid=51738 uid=0 result="success"
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6676] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-timeout,ipv4.dhcp-client-id,ipv6.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp" pid=51738 uid=0 result="success"
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6690] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6691] audit: op="connection-add" uuid="3a35641e-c187-4912-8bb0-af4e928650e9" name="br-ex-if" pid=51738 uid=0 result="success"
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6719] audit: op="connection-update" uuid="8b069e9e-bd63-5e9d-bdd1-b5c43b66b918" name="ci-private-network" args="ipv4.addresses,ipv4.dns,ipv4.method,ipv4.never-default,ipv4.routes,ipv4.routing-rules,ipv6.routes,ipv6.addresses,ipv6.addr-gen-mode,ipv6.dns,ipv6.routing-rules,ipv6.method,ovs-external-ids.data,connection.port-type,connection.controller,connection.master,connection.timestamp,connection.slave-type,ovs-interface.type" pid=51738 uid=0 result="success"
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6732] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6734] audit: op="connection-add" uuid="f54a3e8b-d986-4d33-b8e7-13d4b8990616" name="vlan20-if" pid=51738 uid=0 result="success"
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6747] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6749] audit: op="connection-add" uuid="59fcfee2-78b9-4dfc-8119-3ad866d888c1" name="vlan21-if" pid=51738 uid=0 result="success"
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6762] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6764] audit: op="connection-add" uuid="76b43057-748b-4c3d-94fd-248bcced744b" name="vlan22-if" pid=51738 uid=0 result="success"
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6778] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6780] audit: op="connection-add" uuid="aedd85d4-378a-4eb8-b14f-77b720fda39f" name="vlan23-if" pid=51738 uid=0 result="success"
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6791] audit: op="connection-delete" uuid="52728e87-b91d-3812-9239-09489880e5d3" name="Wired connection 1" pid=51738 uid=0 result="success"
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6801] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <warn>  [1769161393.6805] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6812] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6817] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (8afb3294-ce59-4632-a155-fd329a29f291)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6818] audit: op="connection-activate" uuid="8afb3294-ce59-4632-a155-fd329a29f291" name="br-ex-br" pid=51738 uid=0 result="success"
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6820] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <warn>  [1769161393.6822] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6827] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6830] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (41bf286c-1626-4e6a-a1a5-43899fda1607)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6832] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <warn>  [1769161393.6833] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6837] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6841] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (d5093589-ee87-4ea7-8b7c-afde35c70b3c)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6843] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <warn>  [1769161393.6845] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6849] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6852] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (0ba7ccec-a493-41fb-8252-9c804fe27a7d)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6854] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <warn>  [1769161393.6855] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6859] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6863] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (c0864541-42de-4ccd-af05-4c7ce87a2504)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6865] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <warn>  [1769161393.6866] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6870] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6874] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (e9160306-09c2-4c99-9b9a-e0cda1df1e76)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6876] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <warn>  [1769161393.6878] device (vlan23)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6882] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6885] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (1050024d-b7d9-4ad3-83a2-892c0a4b608a)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6887] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6890] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6892] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6898] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <warn>  [1769161393.6900] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6903] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6908] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (3a35641e-c187-4912-8bb0-af4e928650e9)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6909] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6912] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6914] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6915] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6917] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6925] device (eth1): disconnecting for new activation request.
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6926] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6930] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6932] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6933] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6936] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <warn>  [1769161393.6937] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6941] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6944] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (f54a3e8b-d986-4d33-b8e7-13d4b8990616)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6945] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6948] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6950] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6951] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6954] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <warn>  [1769161393.6955] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6958] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6962] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (59fcfee2-78b9-4dfc-8119-3ad866d888c1)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6963] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6966] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6967] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6969] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6971] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <warn>  [1769161393.6972] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6975] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6979] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (76b43057-748b-4c3d-94fd-248bcced744b)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6980] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6982] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6984] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6985] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6988] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <warn>  [1769161393.6990] device (vlan23)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6992] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6997] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (aedd85d4-378a-4eb8-b14f-77b720fda39f)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.6998] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7001] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7003] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7004] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7006] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7017] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-timeout,ipv4.dhcp-client-id,ipv6.addr-gen-mode,ipv6.method,802-3-ethernet.mtu,connection.autoconnect-priority" pid=51738 uid=0 result="success"
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7019] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7022] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7023] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7036] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7040] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7043] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7046] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7052] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 kernel: ovs-system: entered promiscuous mode
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7058] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7063] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 systemd-udevd[51743]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7067] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7069] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7073] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 kernel: Timeout policy base is empty
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7078] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7081] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7084] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7088] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7092] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7096] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7098] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7102] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7106] dhcp4 (eth0): canceled DHCP transaction
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7106] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7107] dhcp4 (eth0): state changed no lease
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7108] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7118] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7127] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51738 uid=0 result="fail" reason="Device is not activated"
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7131] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 23 04:43:13 np0005593295 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7138] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7145] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7152] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7154] dhcp4 (eth0): state changed new lease, address=38.129.56.185
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7204] device (eth1): disconnecting for new activation request.
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7205] audit: op="connection-activate" uuid="8b069e9e-bd63-5e9d-bdd1-b5c43b66b918" name="ci-private-network" pid=51738 uid=0 result="success"
Jan 23 04:43:13 np0005593295 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7251] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51738 uid=0 result="success"
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7253] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7374] device (eth1): Activation: starting connection 'ci-private-network' (8b069e9e-bd63-5e9d-bdd1-b5c43b66b918)
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7388] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7392] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7398] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7399] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7401] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7402] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7403] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7405] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7406] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7414] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 kernel: br-ex: entered promiscuous mode
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7432] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7438] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7442] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7445] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 04:43:13 np0005593295 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7448] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7451] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7454] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7458] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7461] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7464] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7468] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7471] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7474] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7478] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7484] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7490] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 kernel: vlan22: entered promiscuous mode
Jan 23 04:43:13 np0005593295 systemd-udevd[51742]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7540] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7542] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7545] device (eth1): Activation: successful, device activated.
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7553] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7562] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7580] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7581] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7584] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 04:43:13 np0005593295 kernel: vlan23: entered promiscuous mode
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7653] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7669] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 kernel: vlan21: entered promiscuous mode
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7695] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7698] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7703] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 04:43:13 np0005593295 kernel: vlan20: entered promiscuous mode
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7772] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7792] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7794] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7812] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7859] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7863] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7865] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7869] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7875] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7881] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7886] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7898] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7940] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7942] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 04:43:13 np0005593295 NetworkManager[48964]: <info>  [1769161393.7948] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 04:43:14 np0005593295 NetworkManager[48964]: <info>  [1769161394.9216] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51738 uid=0 result="success"
Jan 23 04:43:15 np0005593295 NetworkManager[48964]: <info>  [1769161395.0858] checkpoint[0x5622b23f4950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 23 04:43:15 np0005593295 NetworkManager[48964]: <info>  [1769161395.0860] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51738 uid=0 result="success"
Jan 23 04:43:15 np0005593295 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 04:43:15 np0005593295 python3.9[52096]: ansible-ansible.legacy.async_status Invoked with jid=j829007530849.51732 mode=status _async_dir=/root/.ansible_async
Jan 23 04:43:15 np0005593295 NetworkManager[48964]: <info>  [1769161395.3730] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51738 uid=0 result="success"
Jan 23 04:43:15 np0005593295 NetworkManager[48964]: <info>  [1769161395.3741] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51738 uid=0 result="success"
Jan 23 04:43:15 np0005593295 NetworkManager[48964]: <info>  [1769161395.5919] audit: op="networking-control" arg="global-dns-configuration" pid=51738 uid=0 result="success"
Jan 23 04:43:15 np0005593295 NetworkManager[48964]: <info>  [1769161395.5949] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 23 04:43:15 np0005593295 NetworkManager[48964]: <info>  [1769161395.5975] audit: op="networking-control" arg="global-dns-configuration" pid=51738 uid=0 result="success"
Jan 23 04:43:15 np0005593295 NetworkManager[48964]: <info>  [1769161395.6013] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51738 uid=0 result="success"
Jan 23 04:43:15 np0005593295 NetworkManager[48964]: <info>  [1769161395.7465] checkpoint[0x5622b23f4a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 23 04:43:15 np0005593295 NetworkManager[48964]: <info>  [1769161395.7471] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51738 uid=0 result="success"
Jan 23 04:43:15 np0005593295 ansible-async_wrapper.py[51736]: Module complete (51736)
Jan 23 04:43:16 np0005593295 ansible-async_wrapper.py[51735]: Done in kid B.
Jan 23 04:43:18 np0005593295 python3.9[52204]: ansible-ansible.legacy.async_status Invoked with jid=j829007530849.51732 mode=status _async_dir=/root/.ansible_async
Jan 23 04:43:19 np0005593295 python3.9[52304]: ansible-ansible.legacy.async_status Invoked with jid=j829007530849.51732 mode=cleanup _async_dir=/root/.ansible_async
Jan 23 04:43:20 np0005593295 python3.9[52456]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:43:20 np0005593295 python3.9[52579]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161399.7871294-923-126053006994693/.source.returncode _original_basename=.f4fcr_xd follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:21 np0005593295 python3.9[52731]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:43:22 np0005593295 python3.9[52854]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161401.1310925-972-253011611847754/.source.cfg _original_basename=.rej6hpns follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:23 np0005593295 python3.9[53007]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:43:23 np0005593295 systemd[1]: Reloading Network Manager...
Jan 23 04:43:23 np0005593295 NetworkManager[48964]: <info>  [1769161403.0936] audit: op="reload" arg="0" pid=53011 uid=0 result="success"
Jan 23 04:43:23 np0005593295 NetworkManager[48964]: <info>  [1769161403.0944] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 23 04:43:23 np0005593295 systemd[1]: Reloaded Network Manager.
Jan 23 04:43:23 np0005593295 systemd-logind[786]: Session 11 logged out. Waiting for processes to exit.
Jan 23 04:43:23 np0005593295 systemd[1]: session-11.scope: Deactivated successfully.
Jan 23 04:43:23 np0005593295 systemd[1]: session-11.scope: Consumed 53.740s CPU time.
Jan 23 04:43:23 np0005593295 systemd-logind[786]: Removed session 11.
Jan 23 04:43:29 np0005593295 systemd-logind[786]: New session 12 of user zuul.
Jan 23 04:43:29 np0005593295 systemd[1]: Started Session 12 of User zuul.
Jan 23 04:43:31 np0005593295 python3.9[53195]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:43:32 np0005593295 python3.9[53350]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:43:33 np0005593295 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 04:43:33 np0005593295 python3.9[53543]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:43:33 np0005593295 systemd[1]: session-12.scope: Deactivated successfully.
Jan 23 04:43:33 np0005593295 systemd[1]: session-12.scope: Consumed 2.277s CPU time.
Jan 23 04:43:33 np0005593295 systemd-logind[786]: Session 12 logged out. Waiting for processes to exit.
Jan 23 04:43:33 np0005593295 systemd-logind[786]: Removed session 12.
Jan 23 04:43:39 np0005593295 systemd-logind[786]: New session 13 of user zuul.
Jan 23 04:43:39 np0005593295 systemd[1]: Started Session 13 of User zuul.
Jan 23 04:43:40 np0005593295 python3.9[53725]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:43:41 np0005593295 python3.9[53879]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:43:42 np0005593295 python3.9[54036]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:43:43 np0005593295 python3.9[54120]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:43:46 np0005593295 python3.9[54274]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:43:47 np0005593295 python3.9[54469]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:48 np0005593295 python3.9[54621]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:43:48 np0005593295 systemd[1]: var-lib-containers-storage-overlay-compat2262428241-merged.mount: Deactivated successfully.
Jan 23 04:43:48 np0005593295 podman[54622]: 2026-01-23 09:43:48.425324946 +0000 UTC m=+0.053176737 system refresh
Jan 23 04:43:49 np0005593295 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:43:49 np0005593295 python3.9[54784]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:43:50 np0005593295 python3.9[54907]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161428.754381-194-95758373487693/.source.json follow=False _original_basename=podman_network_config.j2 checksum=1b0be18864a1e74e2095b155999887790d126e9d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:43:51 np0005593295 python3.9[55059]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:43:51 np0005593295 python3.9[55182]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769161430.5504313-240-19061006517870/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:43:52 np0005593295 python3.9[55334]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:43:53 np0005593295 python3.9[55486]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:43:53 np0005593295 python3.9[55638]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:43:54 np0005593295 python3.9[55790]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:43:55 np0005593295 python3.9[55942]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:43:57 np0005593295 python3.9[56095]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:43:58 np0005593295 python3.9[56249]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:43:59 np0005593295 python3.9[56401]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:44:00 np0005593295 python3.9[56553]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:44:01 np0005593295 python3.9[56706]: ansible-service_facts Invoked
Jan 23 04:44:01 np0005593295 network[56723]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:44:01 np0005593295 network[56724]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:44:01 np0005593295 network[56725]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:44:06 np0005593295 python3.9[57177]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:44:10 np0005593295 python3.9[57330]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 23 04:44:12 np0005593295 python3.9[57482]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:44:12 np0005593295 python3.9[57607]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161451.9298615-673-77630030882417/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:13 np0005593295 python3.9[57761]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:44:14 np0005593295 python3.9[57886]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161453.290572-717-179379001969501/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:16 np0005593295 python3.9[58040]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:18 np0005593295 python3.9[58194]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:44:19 np0005593295 python3.9[58278]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:44:20 np0005593295 python3.9[58432]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:44:21 np0005593295 python3.9[58516]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:44:21 np0005593295 chronyd[794]: chronyd exiting
Jan 23 04:44:21 np0005593295 systemd[1]: Stopping NTP client/server...
Jan 23 04:44:21 np0005593295 systemd[1]: chronyd.service: Deactivated successfully.
Jan 23 04:44:21 np0005593295 systemd[1]: Stopped NTP client/server.
Jan 23 04:44:21 np0005593295 systemd[1]: Starting NTP client/server...
Jan 23 04:44:21 np0005593295 chronyd[58525]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 23 04:44:21 np0005593295 chronyd[58525]: Frequency -23.658 +/- 0.241 ppm read from /var/lib/chrony/drift
Jan 23 04:44:21 np0005593295 chronyd[58525]: Loaded seccomp filter (level 2)
Jan 23 04:44:21 np0005593295 systemd[1]: Started NTP client/server.
Jan 23 04:44:22 np0005593295 systemd[1]: session-13.scope: Deactivated successfully.
Jan 23 04:44:22 np0005593295 systemd[1]: session-13.scope: Consumed 25.790s CPU time.
Jan 23 04:44:22 np0005593295 systemd-logind[786]: Session 13 logged out. Waiting for processes to exit.
Jan 23 04:44:22 np0005593295 systemd-logind[786]: Removed session 13.
Jan 23 04:44:29 np0005593295 systemd-logind[786]: New session 14 of user zuul.
Jan 23 04:44:29 np0005593295 systemd[1]: Started Session 14 of User zuul.
Jan 23 04:44:29 np0005593295 python3.9[58706]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:30 np0005593295 python3.9[58858]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:44:31 np0005593295 python3.9[58981]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161470.1658196-59-272084309555168/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:31 np0005593295 systemd[1]: session-14.scope: Deactivated successfully.
Jan 23 04:44:31 np0005593295 systemd[1]: session-14.scope: Consumed 1.566s CPU time.
Jan 23 04:44:31 np0005593295 systemd-logind[786]: Session 14 logged out. Waiting for processes to exit.
Jan 23 04:44:31 np0005593295 systemd-logind[786]: Removed session 14.
Jan 23 04:44:37 np0005593295 systemd-logind[786]: New session 15 of user zuul.
Jan 23 04:44:37 np0005593295 systemd[1]: Started Session 15 of User zuul.
Jan 23 04:44:38 np0005593295 python3.9[59159]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:44:39 np0005593295 python3.9[59315]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:40 np0005593295 python3.9[59490]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:44:41 np0005593295 python3.9[59613]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769161480.1717734-80-277237372158404/.source.json _original_basename=.maafkbe6 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:42 np0005593295 python3.9[59765]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:44:42 np0005593295 python3.9[59888]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161482.0350761-149-193192911953963/.source _original_basename=.t83b5tsx follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:43 np0005593295 python3.9[60040]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:44:44 np0005593295 python3.9[60192]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:44:45 np0005593295 python3.9[60315]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769161484.1471531-222-172212484964514/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:44:45 np0005593295 python3.9[60467]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:44:46 np0005593295 python3.9[60590]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769161485.275144-222-125022384232886/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:44:47 np0005593295 python3.9[60742]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:48 np0005593295 python3.9[60894]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:44:48 np0005593295 python3.9[61017]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161487.7556248-332-21926245423317/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:49 np0005593295 python3.9[61169]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:44:50 np0005593295 python3.9[61292]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161488.9803607-378-93908684126056/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:51 np0005593295 python3.9[61444]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:44:51 np0005593295 systemd[1]: Reloading.
Jan 23 04:44:51 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:44:51 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:44:51 np0005593295 systemd[1]: Reloading.
Jan 23 04:44:51 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:44:51 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:44:51 np0005593295 systemd[1]: Starting EDPM Container Shutdown...
Jan 23 04:44:51 np0005593295 systemd[1]: Finished EDPM Container Shutdown.
Jan 23 04:44:52 np0005593295 python3.9[61672]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:44:52 np0005593295 python3.9[61795]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161491.9617503-447-196257357072030/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:53 np0005593295 python3.9[61947]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:44:54 np0005593295 python3.9[62070]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161493.2706807-492-1817716569185/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:44:55 np0005593295 python3.9[62222]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:44:55 np0005593295 systemd[1]: Reloading.
Jan 23 04:44:55 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:44:55 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:44:55 np0005593295 systemd[1]: Reloading.
Jan 23 04:44:55 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:44:55 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:44:55 np0005593295 systemd[1]: Starting Create netns directory...
Jan 23 04:44:55 np0005593295 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 04:44:55 np0005593295 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 04:44:55 np0005593295 systemd[1]: Finished Create netns directory.
Jan 23 04:44:56 np0005593295 python3.9[62449]: ansible-ansible.builtin.service_facts Invoked
Jan 23 04:44:56 np0005593295 network[62466]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:44:56 np0005593295 network[62467]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:44:56 np0005593295 network[62468]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:45:01 np0005593295 python3.9[62730]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:45:01 np0005593295 systemd[1]: Reloading.
Jan 23 04:45:01 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:45:01 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:45:01 np0005593295 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 23 04:45:02 np0005593295 iptables.init[62770]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 23 04:45:02 np0005593295 iptables.init[62770]: iptables: Flushing firewall rules: [  OK  ]
Jan 23 04:45:02 np0005593295 systemd[1]: iptables.service: Deactivated successfully.
Jan 23 04:45:02 np0005593295 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 23 04:45:02 np0005593295 python3.9[62966]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:45:03 np0005593295 python3.9[63120]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:45:03 np0005593295 systemd[1]: Reloading.
Jan 23 04:45:04 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:45:04 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:45:04 np0005593295 systemd[1]: Starting Netfilter Tables...
Jan 23 04:45:04 np0005593295 systemd[1]: Finished Netfilter Tables.
Jan 23 04:45:05 np0005593295 python3.9[63311]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:45:06 np0005593295 python3.9[63464]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:45:07 np0005593295 python3.9[63589]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161506.1639-699-262750373175026/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:08 np0005593295 python3.9[63742]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:45:08 np0005593295 systemd[1]: Reloading OpenSSH server daemon...
Jan 23 04:45:08 np0005593295 systemd[1]: Reloaded OpenSSH server daemon.
Jan 23 04:45:08 np0005593295 python3.9[63898]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:09 np0005593295 python3.9[64050]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:45:10 np0005593295 python3.9[64173]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161509.030427-792-221927825154635/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:11 np0005593295 python3.9[64325]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 23 04:45:11 np0005593295 systemd[1]: Starting Time & Date Service...
Jan 23 04:45:11 np0005593295 systemd[1]: Started Time & Date Service.
Jan 23 04:45:12 np0005593295 python3.9[64481]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:13 np0005593295 python3.9[64633]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:45:14 np0005593295 python3.9[64756]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161513.166487-897-241666304338393/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:14 np0005593295 python3.9[64908]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:45:15 np0005593295 python3.9[65031]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161514.383783-942-133691894704269/.source.yaml _original_basename=.p08ujq7o follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:16 np0005593295 python3.9[65183]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:45:16 np0005593295 python3.9[65306]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161515.591409-987-162579262437420/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:17 np0005593295 python3.9[65458]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:45:18 np0005593295 python3.9[65611]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:45:19 np0005593295 python3[65764]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 04:45:19 np0005593295 python3.9[65916]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:45:20 np0005593295 python3.9[66039]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161519.5056405-1104-257627253122521/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:21 np0005593295 python3.9[66191]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:45:21 np0005593295 python3.9[66314]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161520.7699082-1149-45393942645795/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:22 np0005593295 python3.9[66466]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:45:23 np0005593295 python3.9[66589]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161522.029869-1194-198351354581833/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:23 np0005593295 python3.9[66741]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:45:24 np0005593295 python3.9[66864]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161523.325557-1239-266791836193569/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:25 np0005593295 python3.9[67016]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:45:25 np0005593295 python3.9[67139]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161524.6178775-1284-39467673817560/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:26 np0005593295 python3.9[67291]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:27 np0005593295 python3.9[67443]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:45:28 np0005593295 python3.9[67602]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:29 np0005593295 python3.9[67755]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:29 np0005593295 python3.9[67907]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:30 np0005593295 python3.9[68059]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 04:45:31 np0005593295 python3.9[68212]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 04:45:31 np0005593295 systemd[1]: session-15.scope: Deactivated successfully.
Jan 23 04:45:31 np0005593295 systemd[1]: session-15.scope: Consumed 34.191s CPU time.
Jan 23 04:45:31 np0005593295 systemd-logind[786]: Session 15 logged out. Waiting for processes to exit.
Jan 23 04:45:31 np0005593295 systemd-logind[786]: Removed session 15.
Jan 23 04:45:37 np0005593295 systemd-logind[786]: New session 16 of user zuul.
Jan 23 04:45:37 np0005593295 systemd[1]: Started Session 16 of User zuul.
Jan 23 04:45:38 np0005593295 python3.9[68393]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 23 04:45:38 np0005593295 python3.9[68545]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:45:40 np0005593295 python3.9[68697]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:45:41 np0005593295 python3.9[68849]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC+cj2so8SS29oYZ1K+7e02qi6fVkGXJzGMkIN9mgJPLCBtQ6vpBYEObTZZXuMIHhdiMUAp6RDjs11OXDkAB9R7e2ncjMKn7J2EHbmceT7rNq9L0w+QaLKFxl+xdJQ9QtO9ioNgJFXXQZt/IOeE8S4I5yhEM5jn+YEW0LPbp99Wz1d1Ob4GI1t0hCEv/4ayC3nRIXkuIhl7mrV0s22F8NE8f0hZZKaw1u8xmmpbD8ZVBsC6cxWE3kIQBmHu8q9tylaZjLsjGxBDUF9ko3bxeppvLPDMem89VLQCWbgmOHl5ZIPsyNglusTIBUp8uA7g+Agz1uMojClMHnsZl68WjbCAVcRA9y/UgXphGyEYZCUJMv8CjYKzxriyHALZl6YFSyC5ELlEAxL8fyTwtXhQ1+e/lI9Ak3n4suC6JyH0NQ27MPIf7riyUFJLw9lZaDerZOkvI7/Y2PfRvdfyZ57g/xgGeLY0Ch30SFVC04lNXIpsOWbLBOg0BMP9ZiciAYAF9Yc=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIreWuVcekgp7kF5pU+4TIKLHZyhuqd4Ly312ExEA5EG#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJWfXOTsTXqDhdGhW7VcUXsYqCS7TzCPyaa9/dA9e0xKjnni1/GRM8FdYXWYbGsNnBQFWk3/pXD6sj3jKzK34AM=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDWbrXZxuAw0n/xJmOvWW/Qbg53ya2CuJKzcHA+OvDpHLHGxkEuiUhwKvqUbfSTzn0o1M00OYITJIvZVINGRtQC7hGvBPWLVBON097mcmnju857I72U3dGdvGhnEUHyrglCV+xSkafQTTlnY9B59EKImUs/kiwRy3cYDWkCgthJgiPA4QSw6WrzaqpY2ET+7n+yY31EOagGA3ufW43qFbHX4diFuXpS1I1PLvvA4KINlMlsFcyR29j4nQk/vb5hMpLmBOlfVH16CXZC98a0ltp9ib7F3e1Wjdogj92kxwfQMYIeQEBp11Tc/PY5U90J51oyk8xYOKfsP3+r9yczmfRDjwR3+tzUMKyZYAsKQVcOGQC7x9sEXg3mBeXRVrlIVZFMuNVcYq4CY40fDIybcI25GxgRbQR7ZUWODG1SL7RF02Z+LQB6APXkzxdQUWLWPryj/EtOgnHQ1I0+BJTWrqGkKbSj41jhRTfS+MZvRXAJ+fNyZFhpkHo54DrCii4cbyM=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGRPkwTcFVg/dIKRq29iWBfkoVFqIQ1pXOCPxfcGWRFF#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGf/hJ2dg/PRwojw63FLyKqua+ChKP+2bc7Eb0p70H6ve1elFVeY8lVRXx33JWc2m/XfgSWPNcUs9zBG8QcFVak=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDA/6JnQZ3CFC7xgv4DrvdZizVbVnsolKcWkvqzGu1hFHGmOEb7ehbxGPHBnp2N9iRf13H12EI0qNI6A2f44V0oXE3SP+fpJ6PVYQRQpKqTEiweqZaHEyYE2FnKy0HDQisg5hwr1egYLjGXChdkyqWSokL1LqaCyD2+EcOzUvC/GuVQ7eQnQBIGBpYAnNzS/64KKOZ0+0soOPJGxVCma6JN/2GcCunX6j3HmkOOQeuEFETXfUPHh1ylu2+3yINl34ERJN5YwgR/S+BKENOsJTu5XkYTCvc90CuvfkoF9K5Y2yE5nKwZaSf7n2SbUPil2Zph4l7opsd5IKxi6k2mVzw/CO2NHr136BZ06+sKXytDgorWqWzqnci8zfxeYF3D7q7AXD+IDVMP5T6op93oS2enAQFHG1vTLB0otQqnxUgNANbJkrKgXAS8G8I1m2sPz+qOFuuZa2/nqhzrd6/DEur5VoW6n9c/OcrbfapLEzD1jQDmsQI7oZkT++dt3Ogb3Vk=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIII1sLqY7Nqi1A3CKXLokfn1vrns/lK1gUkDNSlbek2o#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM9QZXHUsthFMKA5Si4Htl7MIwK0G4VAltQgbo39JJHrgD7h27U1jbnuJQ1S2bBX8FMSkqf5TPmM7Gr9QOATO+4=#012 create=True mode=0644 path=/tmp/ansible.tqrg3z1f state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:41 np0005593295 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 04:45:42 np0005593295 python3.9[69003]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.tqrg3z1f' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:45:42 np0005593295 python3.9[69157]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.tqrg3z1f state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:43 np0005593295 systemd[1]: session-16.scope: Deactivated successfully.
Jan 23 04:45:43 np0005593295 systemd[1]: session-16.scope: Consumed 3.243s CPU time.
Jan 23 04:45:43 np0005593295 systemd-logind[786]: Session 16 logged out. Waiting for processes to exit.
Jan 23 04:45:43 np0005593295 systemd-logind[786]: Removed session 16.
Jan 23 04:45:49 np0005593295 systemd-logind[786]: New session 17 of user zuul.
Jan 23 04:45:49 np0005593295 systemd[1]: Started Session 17 of User zuul.
Jan 23 04:45:50 np0005593295 python3.9[69335]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:45:52 np0005593295 python3.9[69491]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 23 04:45:52 np0005593295 python3.9[69645]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:45:54 np0005593295 python3.9[69798]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:45:54 np0005593295 python3.9[69951]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:45:55 np0005593295 python3.9[70105]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:45:56 np0005593295 python3.9[70260]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:45:57 np0005593295 systemd[1]: session-17.scope: Deactivated successfully.
Jan 23 04:45:57 np0005593295 systemd[1]: session-17.scope: Consumed 4.254s CPU time.
Jan 23 04:45:57 np0005593295 systemd-logind[786]: Session 17 logged out. Waiting for processes to exit.
Jan 23 04:45:57 np0005593295 systemd-logind[786]: Removed session 17.
Jan 23 04:46:02 np0005593295 systemd-logind[786]: New session 18 of user zuul.
Jan 23 04:46:02 np0005593295 systemd[1]: Started Session 18 of User zuul.
Jan 23 04:46:03 np0005593295 python3.9[70438]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:46:04 np0005593295 python3.9[70594]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:46:06 np0005593295 python3.9[70678]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 04:46:08 np0005593295 python3.9[70829]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:46:10 np0005593295 python3.9[70980]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 04:46:10 np0005593295 python3.9[71130]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:46:10 np0005593295 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:46:10 np0005593295 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:46:11 np0005593295 python3.9[71281]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:46:12 np0005593295 systemd-logind[786]: Session 18 logged out. Waiting for processes to exit.
Jan 23 04:46:12 np0005593295 systemd[1]: session-18.scope: Deactivated successfully.
Jan 23 04:46:12 np0005593295 systemd[1]: session-18.scope: Consumed 5.877s CPU time.
Jan 23 04:46:12 np0005593295 systemd-logind[786]: Removed session 18.
Jan 23 04:46:20 np0005593295 systemd-logind[786]: New session 19 of user zuul.
Jan 23 04:46:20 np0005593295 systemd[1]: Started Session 19 of User zuul.
Jan 23 04:46:26 np0005593295 python3[72047]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:46:28 np0005593295 python3[72142]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 23 04:46:30 np0005593295 python3[72169]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 23 04:46:30 np0005593295 python3[72195]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:46:30 np0005593295 kernel: loop: module loaded
Jan 23 04:46:30 np0005593295 kernel: loop3: detected capacity change from 0 to 41943040
Jan 23 04:46:31 np0005593295 python3[72230]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:46:31 np0005593295 lvm[72233]: PV /dev/loop3 not used.
Jan 23 04:46:31 np0005593295 lvm[72242]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 04:46:31 np0005593295 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Jan 23 04:46:31 np0005593295 lvm[72244]:  1 logical volume(s) in volume group "ceph_vg0" now active
Jan 23 04:46:31 np0005593295 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Jan 23 04:46:31 np0005593295 chronyd[58525]: Selected source 167.160.187.179 (pool.ntp.org)
Jan 23 04:46:32 np0005593295 python3[72322]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 04:46:32 np0005593295 python3[72395]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769161591.9139266-37005-66242161926865/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:46:33 np0005593295 python3[72445]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:46:33 np0005593295 systemd[1]: Reloading.
Jan 23 04:46:33 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:46:33 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:46:33 np0005593295 systemd[1]: Starting Ceph OSD losetup...
Jan 23 04:46:33 np0005593295 bash[72486]: /dev/loop3: [64513]:4328453 (/var/lib/ceph-osd-0.img)
Jan 23 04:46:33 np0005593295 systemd[1]: Finished Ceph OSD losetup.
Jan 23 04:46:33 np0005593295 lvm[72487]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 04:46:33 np0005593295 lvm[72487]: VG ceph_vg0 finished
Jan 23 04:46:36 np0005593295 python3[72511]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:48:49 np0005593295 systemd[1]: Created slice User Slice of UID 42477.
Jan 23 04:48:49 np0005593295 systemd[1]: Starting User Runtime Directory /run/user/42477...
Jan 23 04:48:49 np0005593295 systemd-logind[786]: New session 20 of user ceph-admin.
Jan 23 04:48:49 np0005593295 systemd[1]: Finished User Runtime Directory /run/user/42477.
Jan 23 04:48:49 np0005593295 systemd[1]: Starting User Manager for UID 42477...
Jan 23 04:48:49 np0005593295 systemd[72559]: Queued start job for default target Main User Target.
Jan 23 04:48:49 np0005593295 systemd-logind[786]: New session 22 of user ceph-admin.
Jan 23 04:48:49 np0005593295 systemd[72559]: Created slice User Application Slice.
Jan 23 04:48:49 np0005593295 systemd[72559]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 04:48:49 np0005593295 systemd[72559]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 04:48:49 np0005593295 systemd[72559]: Reached target Paths.
Jan 23 04:48:49 np0005593295 systemd[72559]: Reached target Timers.
Jan 23 04:48:49 np0005593295 systemd[72559]: Starting D-Bus User Message Bus Socket...
Jan 23 04:48:49 np0005593295 systemd[72559]: Starting Create User's Volatile Files and Directories...
Jan 23 04:48:49 np0005593295 systemd[72559]: Finished Create User's Volatile Files and Directories.
Jan 23 04:48:49 np0005593295 systemd[72559]: Listening on D-Bus User Message Bus Socket.
Jan 23 04:48:49 np0005593295 systemd[72559]: Reached target Sockets.
Jan 23 04:48:49 np0005593295 systemd[72559]: Reached target Basic System.
Jan 23 04:48:49 np0005593295 systemd[72559]: Reached target Main User Target.
Jan 23 04:48:49 np0005593295 systemd[72559]: Startup finished in 124ms.
Jan 23 04:48:49 np0005593295 systemd[1]: Started User Manager for UID 42477.
Jan 23 04:48:49 np0005593295 systemd[1]: Started Session 20 of User ceph-admin.
Jan 23 04:48:49 np0005593295 systemd[1]: Started Session 22 of User ceph-admin.
Jan 23 04:48:49 np0005593295 systemd-logind[786]: New session 23 of user ceph-admin.
Jan 23 04:48:49 np0005593295 systemd[1]: Started Session 23 of User ceph-admin.
Jan 23 04:48:50 np0005593295 systemd-logind[786]: New session 24 of user ceph-admin.
Jan 23 04:48:50 np0005593295 systemd[1]: Started Session 24 of User ceph-admin.
Jan 23 04:48:50 np0005593295 systemd-logind[786]: New session 25 of user ceph-admin.
Jan 23 04:48:50 np0005593295 systemd[1]: Started Session 25 of User ceph-admin.
Jan 23 04:48:50 np0005593295 systemd-logind[786]: New session 26 of user ceph-admin.
Jan 23 04:48:50 np0005593295 systemd[1]: Started Session 26 of User ceph-admin.
Jan 23 04:48:51 np0005593295 systemd-logind[786]: New session 27 of user ceph-admin.
Jan 23 04:48:51 np0005593295 systemd[1]: Started Session 27 of User ceph-admin.
Jan 23 04:48:51 np0005593295 systemd-logind[786]: New session 28 of user ceph-admin.
Jan 23 04:48:51 np0005593295 systemd[1]: Started Session 28 of User ceph-admin.
Jan 23 04:48:51 np0005593295 systemd-logind[786]: New session 29 of user ceph-admin.
Jan 23 04:48:51 np0005593295 systemd[1]: Started Session 29 of User ceph-admin.
Jan 23 04:48:51 np0005593295 systemd-logind[786]: New session 30 of user ceph-admin.
Jan 23 04:48:52 np0005593295 systemd[1]: Started Session 30 of User ceph-admin.
Jan 23 04:48:53 np0005593295 systemd-logind[786]: New session 31 of user ceph-admin.
Jan 23 04:48:53 np0005593295 systemd[1]: Started Session 31 of User ceph-admin.
Jan 23 04:48:53 np0005593295 systemd-logind[786]: New session 32 of user ceph-admin.
Jan 23 04:48:53 np0005593295 systemd[1]: Started Session 32 of User ceph-admin.
Jan 23 04:48:53 np0005593295 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:49:53 np0005593295 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:49:54 np0005593295 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:49:54 np0005593295 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:49:54 np0005593295 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:49:54 np0005593295 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73155 (sysctl)
Jan 23 04:49:54 np0005593295 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 23 04:49:54 np0005593295 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 23 04:49:55 np0005593295 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:49:55 np0005593295 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:49:55 np0005593295 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:49:58 np0005593295 systemd[1]: var-lib-containers-storage-overlay-compat2815177462-lower\x2dmapped.mount: Deactivated successfully.
Jan 23 04:50:33 np0005593295 podman[73332]: 2026-01-23 09:50:33.868790058 +0000 UTC m=+38.084395360 container create e46c5f66cf11247be75b9fc4be34bb5cb9a087e014c010fbe1dfe03c8217d7de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hungry_hermann, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Jan 23 04:50:33 np0005593295 podman[73332]: 2026-01-23 09:50:33.853240388 +0000 UTC m=+38.068845710 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:50:33 np0005593295 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 23 04:50:33 np0005593295 systemd[1]: Started libpod-conmon-e46c5f66cf11247be75b9fc4be34bb5cb9a087e014c010fbe1dfe03c8217d7de.scope.
Jan 23 04:50:33 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:50:33 np0005593295 podman[73332]: 2026-01-23 09:50:33.97914756 +0000 UTC m=+38.194752882 container init e46c5f66cf11247be75b9fc4be34bb5cb9a087e014c010fbe1dfe03c8217d7de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hungry_hermann, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 23 04:50:33 np0005593295 podman[73332]: 2026-01-23 09:50:33.98780466 +0000 UTC m=+38.203409982 container start e46c5f66cf11247be75b9fc4be34bb5cb9a087e014c010fbe1dfe03c8217d7de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hungry_hermann, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 04:50:33 np0005593295 podman[73332]: 2026-01-23 09:50:33.99168024 +0000 UTC m=+38.207285592 container attach e46c5f66cf11247be75b9fc4be34bb5cb9a087e014c010fbe1dfe03c8217d7de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hungry_hermann, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:50:33 np0005593295 hungry_hermann[73400]: 167 167
Jan 23 04:50:33 np0005593295 systemd[1]: libpod-e46c5f66cf11247be75b9fc4be34bb5cb9a087e014c010fbe1dfe03c8217d7de.scope: Deactivated successfully.
Jan 23 04:50:33 np0005593295 podman[73332]: 2026-01-23 09:50:33.9951296 +0000 UTC m=+38.210734912 container died e46c5f66cf11247be75b9fc4be34bb5cb9a087e014c010fbe1dfe03c8217d7de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hungry_hermann, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS)
Jan 23 04:50:34 np0005593295 systemd[1]: var-lib-containers-storage-overlay-30ee5341d832c697ce3d7f154ce66edfcc7e7c0b309659f122d027af60b99079-merged.mount: Deactivated successfully.
Jan 23 04:50:34 np0005593295 podman[73332]: 2026-01-23 09:50:34.042870864 +0000 UTC m=+38.258476166 container remove e46c5f66cf11247be75b9fc4be34bb5cb9a087e014c010fbe1dfe03c8217d7de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hungry_hermann, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:50:34 np0005593295 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:50:34 np0005593295 systemd[1]: libpod-conmon-e46c5f66cf11247be75b9fc4be34bb5cb9a087e014c010fbe1dfe03c8217d7de.scope: Deactivated successfully.
Jan 23 04:50:34 np0005593295 podman[73424]: 2026-01-23 09:50:34.205366881 +0000 UTC m=+0.048402000 container create d25dbb6340a3047dfe5d8f4772aa594b733d66e3f295c8f12c0768ac6520e359 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_solomon, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Jan 23 04:50:34 np0005593295 systemd[1]: Started libpod-conmon-d25dbb6340a3047dfe5d8f4772aa594b733d66e3f295c8f12c0768ac6520e359.scope.
Jan 23 04:50:34 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:50:34 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4adaf7497dcf8b5afe505ff619eee9241ce1e42c7af5be057c7e964eef7d6d49/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:34 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4adaf7497dcf8b5afe505ff619eee9241ce1e42c7af5be057c7e964eef7d6d49/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:34 np0005593295 podman[73424]: 2026-01-23 09:50:34.185239175 +0000 UTC m=+0.028274314 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:50:34 np0005593295 podman[73424]: 2026-01-23 09:50:34.293197761 +0000 UTC m=+0.136232880 container init d25dbb6340a3047dfe5d8f4772aa594b733d66e3f295c8f12c0768ac6520e359 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_solomon, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:50:34 np0005593295 podman[73424]: 2026-01-23 09:50:34.303404158 +0000 UTC m=+0.146439267 container start d25dbb6340a3047dfe5d8f4772aa594b733d66e3f295c8f12c0768ac6520e359 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_solomon, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Jan 23 04:50:34 np0005593295 podman[73424]: 2026-01-23 09:50:34.308019874 +0000 UTC m=+0.151054993 container attach d25dbb6340a3047dfe5d8f4772aa594b733d66e3f295c8f12c0768ac6520e359 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_solomon, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]: [
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:    {
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:        "available": false,
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:        "being_replaced": false,
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:        "ceph_device_lvm": false,
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:        "lsm_data": {},
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:        "lvs": [],
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:        "path": "/dev/sr0",
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:        "rejected_reasons": [
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "Has a FileSystem",
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "Insufficient space (<5GB)"
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:        ],
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:        "sys_api": {
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "actuators": null,
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "device_nodes": [
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:                "sr0"
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            ],
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "devname": "sr0",
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "human_readable_size": "482.00 KB",
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "id_bus": "ata",
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "model": "QEMU DVD-ROM",
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "nr_requests": "2",
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "parent": "/dev/sr0",
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "partitions": {},
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "path": "/dev/sr0",
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "removable": "1",
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "rev": "2.5+",
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "ro": "0",
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "rotational": "1",
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "sas_address": "",
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "sas_device_handle": "",
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "scheduler_mode": "mq-deadline",
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "sectors": 0,
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "sectorsize": "2048",
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "size": 493568.0,
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "support_discard": "2048",
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "type": "disk",
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:            "vendor": "QEMU"
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:        }
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]:    }
Jan 23 04:50:35 np0005593295 vigilant_solomon[73440]: ]
Jan 23 04:50:35 np0005593295 systemd[1]: libpod-d25dbb6340a3047dfe5d8f4772aa594b733d66e3f295c8f12c0768ac6520e359.scope: Deactivated successfully.
Jan 23 04:50:35 np0005593295 podman[73424]: 2026-01-23 09:50:35.079521892 +0000 UTC m=+0.922557041 container died d25dbb6340a3047dfe5d8f4772aa594b733d66e3f295c8f12c0768ac6520e359 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_solomon, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid)
Jan 23 04:50:35 np0005593295 systemd[1]: var-lib-containers-storage-overlay-4adaf7497dcf8b5afe505ff619eee9241ce1e42c7af5be057c7e964eef7d6d49-merged.mount: Deactivated successfully.
Jan 23 04:50:35 np0005593295 podman[73424]: 2026-01-23 09:50:35.120900119 +0000 UTC m=+0.963935238 container remove d25dbb6340a3047dfe5d8f4772aa594b733d66e3f295c8f12c0768ac6520e359 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_solomon, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Jan 23 04:50:35 np0005593295 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:50:35 np0005593295 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:50:35 np0005593295 systemd[1]: libpod-conmon-d25dbb6340a3047dfe5d8f4772aa594b733d66e3f295c8f12c0768ac6520e359.scope: Deactivated successfully.
Jan 23 04:50:37 np0005593295 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:50:37 np0005593295 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:50:37 np0005593295 podman[75427]: 2026-01-23 09:50:37.994391355 +0000 UTC m=+0.039229438 container create 8e92715603959a0beed35407b13bd59ae771800bd98b1c2683803b6deaeba39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_roentgen, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Jan 23 04:50:38 np0005593295 systemd[1]: Started libpod-conmon-8e92715603959a0beed35407b13bd59ae771800bd98b1c2683803b6deaeba39c.scope.
Jan 23 04:50:38 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:50:38 np0005593295 podman[75427]: 2026-01-23 09:50:38.055816766 +0000 UTC m=+0.100654859 container init 8e92715603959a0beed35407b13bd59ae771800bd98b1c2683803b6deaeba39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_roentgen, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Jan 23 04:50:38 np0005593295 podman[75427]: 2026-01-23 09:50:38.061128239 +0000 UTC m=+0.105966322 container start 8e92715603959a0beed35407b13bd59ae771800bd98b1c2683803b6deaeba39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_roentgen, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Jan 23 04:50:38 np0005593295 cranky_roentgen[75444]: 167 167
Jan 23 04:50:38 np0005593295 systemd[1]: libpod-8e92715603959a0beed35407b13bd59ae771800bd98b1c2683803b6deaeba39c.scope: Deactivated successfully.
Jan 23 04:50:38 np0005593295 podman[75427]: 2026-01-23 09:50:38.070163457 +0000 UTC m=+0.115001540 container attach 8e92715603959a0beed35407b13bd59ae771800bd98b1c2683803b6deaeba39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_roentgen, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Jan 23 04:50:38 np0005593295 podman[75427]: 2026-01-23 09:50:38.070788451 +0000 UTC m=+0.115626574 container died 8e92715603959a0beed35407b13bd59ae771800bd98b1c2683803b6deaeba39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_roentgen, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:50:38 np0005593295 podman[75427]: 2026-01-23 09:50:37.977363262 +0000 UTC m=+0.022201375 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:50:38 np0005593295 podman[75427]: 2026-01-23 09:50:38.108852672 +0000 UTC m=+0.153690755 container remove 8e92715603959a0beed35407b13bd59ae771800bd98b1c2683803b6deaeba39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_roentgen, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:50:38 np0005593295 systemd[1]: libpod-conmon-8e92715603959a0beed35407b13bd59ae771800bd98b1c2683803b6deaeba39c.scope: Deactivated successfully.
Jan 23 04:50:38 np0005593295 podman[75460]: 2026-01-23 09:50:38.187092381 +0000 UTC m=+0.048604335 container create 1d1cab7cce0d31c65f00e07118afae557459716da2437d2034b12bfd7dedb43c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jepsen, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:50:38 np0005593295 systemd[1]: Started libpod-conmon-1d1cab7cce0d31c65f00e07118afae557459716da2437d2034b12bfd7dedb43c.scope.
Jan 23 04:50:38 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:50:38 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e200e91b5eec93c149ba7f4c62968dbeb981a25b1b6485af560ee8141f136c90/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:38 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e200e91b5eec93c149ba7f4c62968dbeb981a25b1b6485af560ee8141f136c90/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:38 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e200e91b5eec93c149ba7f4c62968dbeb981a25b1b6485af560ee8141f136c90/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:38 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e200e91b5eec93c149ba7f4c62968dbeb981a25b1b6485af560ee8141f136c90/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:38 np0005593295 podman[75460]: 2026-01-23 09:50:38.248025819 +0000 UTC m=+0.109537773 container init 1d1cab7cce0d31c65f00e07118afae557459716da2437d2034b12bfd7dedb43c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jepsen, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:50:38 np0005593295 podman[75460]: 2026-01-23 09:50:38.255565283 +0000 UTC m=+0.117077237 container start 1d1cab7cce0d31c65f00e07118afae557459716da2437d2034b12bfd7dedb43c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jepsen, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325)
Jan 23 04:50:38 np0005593295 podman[75460]: 2026-01-23 09:50:38.259856253 +0000 UTC m=+0.121368407 container attach 1d1cab7cce0d31c65f00e07118afae557459716da2437d2034b12bfd7dedb43c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jepsen, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Jan 23 04:50:38 np0005593295 podman[75460]: 2026-01-23 09:50:38.167912677 +0000 UTC m=+0.029424661 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:50:38 np0005593295 systemd[1]: libpod-1d1cab7cce0d31c65f00e07118afae557459716da2437d2034b12bfd7dedb43c.scope: Deactivated successfully.
Jan 23 04:50:38 np0005593295 podman[75460]: 2026-01-23 09:50:38.354075661 +0000 UTC m=+0.215587615 container died 1d1cab7cce0d31c65f00e07118afae557459716da2437d2034b12bfd7dedb43c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jepsen, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:50:38 np0005593295 podman[75460]: 2026-01-23 09:50:38.402568302 +0000 UTC m=+0.264080256 container remove 1d1cab7cce0d31c65f00e07118afae557459716da2437d2034b12bfd7dedb43c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jepsen, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True)
Jan 23 04:50:38 np0005593295 systemd[1]: libpod-conmon-1d1cab7cce0d31c65f00e07118afae557459716da2437d2034b12bfd7dedb43c.scope: Deactivated successfully.
Jan 23 04:50:38 np0005593295 systemd[1]: Reloading.
Jan 23 04:50:38 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:50:38 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:50:38 np0005593295 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:50:38 np0005593295 systemd[1]: Reloading.
Jan 23 04:50:38 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:50:38 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:50:38 np0005593295 systemd[1]: Reached target All Ceph clusters and services.
Jan 23 04:50:38 np0005593295 systemd[1]: Reloading.
Jan 23 04:50:39 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:50:39 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:50:39 np0005593295 systemd[1]: Reached target Ceph cluster f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:50:39 np0005593295 systemd[1]: Reloading.
Jan 23 04:50:39 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:50:39 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:50:39 np0005593295 systemd[1]: Reloading.
Jan 23 04:50:39 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:50:39 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:50:39 np0005593295 systemd[1]: Created slice Slice /system/ceph-f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:50:39 np0005593295 systemd[1]: Reached target System Time Set.
Jan 23 04:50:39 np0005593295 systemd[1]: Reached target System Time Synchronized.
Jan 23 04:50:39 np0005593295 systemd[1]: Starting Ceph mon.compute-2 for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:50:39 np0005593295 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:50:39 np0005593295 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:50:39 np0005593295 podman[75752]: 2026-01-23 09:50:39.976342718 +0000 UTC m=+0.047558140 container create 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0)
Jan 23 04:50:40 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3af5d6b8d37d6524135efc3a43b1cfcd035993efbedfefadd35921da09b6ddc6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:40 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3af5d6b8d37d6524135efc3a43b1cfcd035993efbedfefadd35921da09b6ddc6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:40 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3af5d6b8d37d6524135efc3a43b1cfcd035993efbedfefadd35921da09b6ddc6/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:40 np0005593295 podman[75752]: 2026-01-23 09:50:40.045522448 +0000 UTC m=+0.116737970 container init 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:50:40 np0005593295 podman[75752]: 2026-01-23 09:50:39.956588191 +0000 UTC m=+0.027803643 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:50:40 np0005593295 podman[75752]: 2026-01-23 09:50:40.052214463 +0000 UTC m=+0.123429935 container start 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Jan 23 04:50:40 np0005593295 bash[75752]: 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4
Jan 23 04:50:40 np0005593295 systemd[1]: Started Ceph mon.compute-2 for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mon, pid 2
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: pidfile_write: ignore empty --pid-file
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: load: jerasure load: lrc 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: RocksDB version: 7.9.2
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Git sha 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Compile date 2025-07-17 03:12:14
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: DB SUMMARY
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: DB Session ID:  17IZ7DW7X4LNV3P33NJD
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: CURRENT file:  CURRENT
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: IDENTITY file:  IDENTITY
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-2/store.db dir, Total Num: 0, files: 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-2/store.db: 000004.log size: 511 ; 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                         Options.error_if_exists: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                       Options.create_if_missing: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                         Options.paranoid_checks: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                                     Options.env: 0x55c650375c20
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                                      Options.fs: PosixFileSystem
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                                Options.info_log: 0x55c65134da20
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                Options.max_file_opening_threads: 16
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                              Options.statistics: (nil)
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                               Options.use_fsync: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                       Options.max_log_file_size: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                         Options.allow_fallocate: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                        Options.use_direct_reads: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:          Options.create_missing_column_families: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                              Options.db_log_dir: 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                                 Options.wal_dir: 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                   Options.advise_random_on_open: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                    Options.write_buffer_manager: 0x55c651351900
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                            Options.rate_limiter: (nil)
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                  Options.unordered_write: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                               Options.row_cache: None
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                              Options.wal_filter: None
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:             Options.allow_ingest_behind: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:             Options.two_write_queues: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:             Options.manual_wal_flush: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:             Options.wal_compression: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:             Options.atomic_flush: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                 Options.log_readahead_size: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:             Options.allow_data_in_errors: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:             Options.db_host_id: __hostname__
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:             Options.max_background_jobs: 2
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:             Options.max_background_compactions: -1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:             Options.max_subcompactions: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:             Options.max_total_wal_size: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                          Options.max_open_files: -1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                          Options.bytes_per_sync: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:       Options.compaction_readahead_size: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                  Options.max_background_flushes: -1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Compression algorithms supported:
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: 	kZSTD supported: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: 	kXpressCompression supported: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: 	kBZip2Compression supported: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: 	kLZ4Compression supported: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: 	kZlibCompression supported: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: 	kSnappyCompression supported: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:           Options.merge_operator: 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:        Options.compaction_filter: None
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c65134d6a0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55c6513709b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:        Options.write_buffer_size: 33554432
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:  Options.max_write_buffer_number: 2
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:          Options.compression: NoCompression
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:             Options.num_levels: 7
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: dbf9ba81-81fe-4d1e-9307-233133587890
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161840104742, "job": 1, "event": "recovery_started", "wal_files": [4]}
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161840106884, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161840107013, "job": 1, "event": "recovery_finished"}
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55c651372e00
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: DB pointer 0x55c651382000
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55c6513709b0#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2 does not exist in monmap, will attempt to join an existing cluster
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: using public_addr v2:192.168.122.102:0/0 -> [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: starting mon.compute-2 rank -1 at public addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] at bind addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-2 fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(???) e0 preinit fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).mds e1 new map
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).mds e1 print_map#012e1#012btime 2026-01-23T09:47:38:565964+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: -1#012 #012No filesystems configured
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e8 e8: 2 total, 0 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e9 e9: 2 total, 0 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e10 e10: 2 total, 1 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e11 e11: 2 total, 1 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e12 e12: 2 total, 1 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e13 e13: 2 total, 1 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e14 e14: 2 total, 1 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e15 e15: 2 total, 1 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e16 e16: 2 total, 1 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e17 e17: 2 total, 2 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e18 e18: 2 total, 2 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e19 e19: 2 total, 2 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e20 e20: 2 total, 2 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e21 e21: 2 total, 2 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e22 e22: 2 total, 2 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e23 e23: 2 total, 2 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e24 e24: 2 total, 2 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e25 e25: 2 total, 2 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e26 e26: 2 total, 2 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e27 e27: 2 total, 2 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e28 e28: 2 total, 2 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e29 e29: 2 total, 2 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e30 e30: 2 total, 2 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e31 e31: 2 total, 2 up, 2 in
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e31 crush map has features 3314933000852226048, adjusting msgr requires
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e31 crush map has features 288514051259236352, adjusting msgr requires
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e31 crush map has features 288514051259236352, adjusting msgr requires
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).osd e31 crush map has features 288514051259236352, adjusting msgr requires
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/1144026165' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/1803776421' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/1803776421' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: Health check update: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/2193766018' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/2193766018' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/2528169956' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/2528169956' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/29302298' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/29302298' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/2695482257' entity='client.admin' 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: Saving service ingress.rgw.default spec with placement count:2
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: Saving service node-exporter spec with placement *
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: Saving service grafana spec with placement compute-0;count:1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: Saving service prometheus spec with placement compute-0;count:1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: Saving service alertmanager spec with placement compute-0;count:1
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/1143624271' entity='client.admin' 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/3906855381' entity='client.admin' 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: Updating compute-2:/etc/ceph/ceph.conf
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/2854364725' entity='client.admin' 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: Deploying daemon mon.compute-2 on compute-2
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: Cluster is now healthy
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/2852887520' entity='client.admin' 
Jan 23 04:50:40 np0005593295 ceph-mon[75771]: mon.compute-2@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Jan 23 04:50:42 np0005593295 ceph-mon[75771]: mon.compute-2@-1(probing) e2  my rank is now 1 (was -1)
Jan 23 04:50:42 np0005593295 ceph-mon[75771]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Jan 23 04:50:42 np0005593295 ceph-mon[75771]: paxos.1).electionLogic(0) init, first boot, initializing epoch at 1 
Jan 23 04:50:42 np0005593295 ceph-mon[75771]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 04:50:45 np0005593295 ceph-mon[75771]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 04:50:45 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Jan 23 04:50:45 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout,16=squid ondisk layout}
Jan 23 04:50:45 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 04:50:45 np0005593295 ceph-mon[75771]: mgrc update_daemon_metadata mon.compute-2 metadata {addrs=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0],arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-2,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864316,os=Linux}
Jan 23 04:50:45 np0005593295 ceph-mon[75771]: mon.compute-0 calling monitor election
Jan 23 04:50:45 np0005593295 ceph-mon[75771]: mon.compute-2 calling monitor election
Jan 23 04:50:45 np0005593295 ceph-mon[75771]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Jan 23 04:50:45 np0005593295 ceph-mon[75771]: overall HEALTH_OK
Jan 23 04:50:45 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:47 np0005593295 ceph-mon[75771]: Deploying daemon mon.compute-1 on compute-1
Jan 23 04:50:47 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 23 04:50:47 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 23 04:50:47 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 23 04:50:47 np0005593295 ceph-mon[75771]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Jan 23 04:50:47 np0005593295 ceph-mon[75771]: paxos.1).electionLogic(10) init, last seen epoch 10
Jan 23 04:50:47 np0005593295 ceph-mon[75771]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 04:50:48 np0005593295 python3[75835]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a -f 'name=ceph-?(.*)-mgr.*' --format \{\{\.Command\}\} --no-trunc#012 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:50:52 np0005593295 ceph-mon[75771]: paxos.1).electionLogic(11) init, last seen epoch 11, mid-election, bumping
Jan 23 04:50:52 np0005593295 ceph-mon[75771]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 04:50:52 np0005593295 ceph-mon[75771]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 04:50:52 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 04:50:53 np0005593295 ceph-mon[75771]: mon.compute-0 calling monitor election
Jan 23 04:50:53 np0005593295 ceph-mon[75771]: mon.compute-2 calling monitor election
Jan 23 04:50:53 np0005593295 ceph-mon[75771]: mon.compute-1 calling monitor election
Jan 23 04:50:53 np0005593295 ceph-mon[75771]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 23 04:50:53 np0005593295 ceph-mon[75771]: overall HEALTH_OK
Jan 23 04:50:53 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:53 np0005593295 podman[75941]: 2026-01-23 09:50:53.588742013 +0000 UTC m=+0.045999820 container create f47c4544ffeddc5df0a5cef2b2e271d916bf7fb494e7aef1691ff7f71bf04e6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_satoshi, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:50:53 np0005593295 systemd[72559]: Starting Mark boot as successful...
Jan 23 04:50:53 np0005593295 systemd[1]: Started libpod-conmon-f47c4544ffeddc5df0a5cef2b2e271d916bf7fb494e7aef1691ff7f71bf04e6d.scope.
Jan 23 04:50:53 np0005593295 systemd[72559]: Finished Mark boot as successful.
Jan 23 04:50:53 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:50:53 np0005593295 podman[75941]: 2026-01-23 09:50:53.644073039 +0000 UTC m=+0.101330876 container init f47c4544ffeddc5df0a5cef2b2e271d916bf7fb494e7aef1691ff7f71bf04e6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_satoshi, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:50:53 np0005593295 podman[75941]: 2026-01-23 09:50:53.650817889 +0000 UTC m=+0.108075696 container start f47c4544ffeddc5df0a5cef2b2e271d916bf7fb494e7aef1691ff7f71bf04e6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_satoshi, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Jan 23 04:50:53 np0005593295 podman[75941]: 2026-01-23 09:50:53.654879699 +0000 UTC m=+0.112137596 container attach f47c4544ffeddc5df0a5cef2b2e271d916bf7fb494e7aef1691ff7f71bf04e6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_satoshi, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2)
Jan 23 04:50:53 np0005593295 priceless_satoshi[75958]: 167 167
Jan 23 04:50:53 np0005593295 systemd[1]: libpod-f47c4544ffeddc5df0a5cef2b2e271d916bf7fb494e7aef1691ff7f71bf04e6d.scope: Deactivated successfully.
Jan 23 04:50:53 np0005593295 podman[75941]: 2026-01-23 09:50:53.656750716 +0000 UTC m=+0.114008523 container died f47c4544ffeddc5df0a5cef2b2e271d916bf7fb494e7aef1691ff7f71bf04e6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_satoshi, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Jan 23 04:50:53 np0005593295 podman[75941]: 2026-01-23 09:50:53.567126623 +0000 UTC m=+0.024384460 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:50:53 np0005593295 systemd[1]: var-lib-containers-storage-overlay-25153313b40cf1585c20291fc995ff662d6c6077010694753f60df16ac3882eb-merged.mount: Deactivated successfully.
Jan 23 04:50:53 np0005593295 podman[75941]: 2026-01-23 09:50:53.694186659 +0000 UTC m=+0.151444466 container remove f47c4544ffeddc5df0a5cef2b2e271d916bf7fb494e7aef1691ff7f71bf04e6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_satoshi, org.label-schema.vendor=CentOS, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 23 04:50:53 np0005593295 systemd[1]: libpod-conmon-f47c4544ffeddc5df0a5cef2b2e271d916bf7fb494e7aef1691ff7f71bf04e6d.scope: Deactivated successfully.
Jan 23 04:50:53 np0005593295 systemd[1]: Reloading.
Jan 23 04:50:53 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:50:53 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:50:54 np0005593295 systemd[1]: Reloading.
Jan 23 04:50:54 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:50:54 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:50:54 np0005593295 systemd[1]: Starting Ceph mgr.compute-2.uczrot for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:50:54 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:54 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:54 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.uczrot", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 23 04:50:54 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.uczrot", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 23 04:50:54 np0005593295 ceph-mon[75771]: Deploying daemon mgr.compute-2.uczrot on compute-2
Jan 23 04:50:54 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/4282911488' entity='client.admin' 
Jan 23 04:50:54 np0005593295 podman[76100]: 2026-01-23 09:50:54.472107157 +0000 UTC m=+0.044699030 container create 493e3a3dda7766566066c301f8593d7d1e6e8d9c2ba535766866cb6825a13835 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid)
Jan 23 04:50:54 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60da7b19c78c8fa12520947c627d4cababc90b9bbf66afd8090d0a7ec372a5ae/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:54 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60da7b19c78c8fa12520947c627d4cababc90b9bbf66afd8090d0a7ec372a5ae/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:54 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60da7b19c78c8fa12520947c627d4cababc90b9bbf66afd8090d0a7ec372a5ae/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:54 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60da7b19c78c8fa12520947c627d4cababc90b9bbf66afd8090d0a7ec372a5ae/merged/var/lib/ceph/mgr/ceph-compute-2.uczrot supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:54 np0005593295 podman[76100]: 2026-01-23 09:50:54.52382901 +0000 UTC m=+0.096420893 container init 493e3a3dda7766566066c301f8593d7d1e6e8d9c2ba535766866cb6825a13835 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Jan 23 04:50:54 np0005593295 podman[76100]: 2026-01-23 09:50:54.531748857 +0000 UTC m=+0.104340730 container start 493e3a3dda7766566066c301f8593d7d1e6e8d9c2ba535766866cb6825a13835 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:50:54 np0005593295 bash[76100]: 493e3a3dda7766566066c301f8593d7d1e6e8d9c2ba535766866cb6825a13835
Jan 23 04:50:54 np0005593295 podman[76100]: 2026-01-23 09:50:54.453249318 +0000 UTC m=+0.025841211 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:50:54 np0005593295 systemd[1]: Started Ceph mgr.compute-2.uczrot for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:50:54 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Jan 23 04:50:54 np0005593295 ceph-mgr[76120]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:50:54 np0005593295 ceph-mgr[76120]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 23 04:50:54 np0005593295 ceph-mgr[76120]: pidfile_write: ignore empty --pid-file
Jan 23 04:50:54 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Jan 23 04:50:54 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'alerts'
Jan 23 04:50:54 np0005593295 ceph-mgr[76120]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:50:54 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'balancer'
Jan 23 04:50:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:54.722+0000 7f857cdd3140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:50:54 np0005593295 ceph-mgr[76120]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:50:54 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'cephadm'
Jan 23 04:50:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:54.812+0000 7f857cdd3140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:50:55 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e31 _set_new_cache_sizes cache_size:1019912092 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:50:55 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'crash'
Jan 23 04:50:55 np0005593295 ceph-mgr[76120]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:50:55 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'dashboard'
Jan 23 04:50:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:55.730+0000 7f857cdd3140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:50:56 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:56 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:56 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/3189222711' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Jan 23 04:50:56 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:56 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.jmakme", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 23 04:50:56 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.jmakme", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 23 04:50:56 np0005593295 ceph-mon[75771]: Deploying daemon mgr.compute-1.jmakme on compute-1
Jan 23 04:50:56 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'devicehealth'
Jan 23 04:50:56 np0005593295 ceph-mgr[76120]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:50:56 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'diskprediction_local'
Jan 23 04:50:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:56.424+0000 7f857cdd3140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:50:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 23 04:50:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 23 04:50:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]:  from numpy import show_config as show_numpy_config
Jan 23 04:50:56 np0005593295 ceph-mgr[76120]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:50:56 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'influx'
Jan 23 04:50:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:56.608+0000 7f857cdd3140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:50:56 np0005593295 ceph-mgr[76120]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:50:56 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'insights'
Jan 23 04:50:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:56.680+0000 7f857cdd3140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:50:56 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'iostat'
Jan 23 04:50:56 np0005593295 ceph-mgr[76120]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:50:56 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'k8sevents'
Jan 23 04:50:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:56.824+0000 7f857cdd3140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:50:57 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/3189222711' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Jan 23 04:50:57 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:57 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:57 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:57 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:57 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 04:50:57 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 23 04:50:57 np0005593295 ceph-mon[75771]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Jan 23 04:50:57 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/1618362368' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Jan 23 04:50:57 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'localpool'
Jan 23 04:50:57 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'mds_autoscaler'
Jan 23 04:50:57 np0005593295 podman[76242]: 2026-01-23 09:50:57.457640599 +0000 UTC m=+0.040714792 container create 71be559bbfdca95934b4dadc76f3f5bcc3a400d48a547b3dac1e6b29cc027d28 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_dewdney, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Jan 23 04:50:57 np0005593295 systemd[1]: Started libpod-conmon-71be559bbfdca95934b4dadc76f3f5bcc3a400d48a547b3dac1e6b29cc027d28.scope.
Jan 23 04:50:57 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:50:57 np0005593295 podman[76242]: 2026-01-23 09:50:57.438879342 +0000 UTC m=+0.021953555 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:50:57 np0005593295 podman[76242]: 2026-01-23 09:50:57.537214383 +0000 UTC m=+0.120288606 container init 71be559bbfdca95934b4dadc76f3f5bcc3a400d48a547b3dac1e6b29cc027d28 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_dewdney, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True)
Jan 23 04:50:57 np0005593295 podman[76242]: 2026-01-23 09:50:57.543872982 +0000 UTC m=+0.126947175 container start 71be559bbfdca95934b4dadc76f3f5bcc3a400d48a547b3dac1e6b29cc027d28 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_dewdney, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Jan 23 04:50:57 np0005593295 podman[76242]: 2026-01-23 09:50:57.548094144 +0000 UTC m=+0.131168437 container attach 71be559bbfdca95934b4dadc76f3f5bcc3a400d48a547b3dac1e6b29cc027d28 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_dewdney, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:50:57 np0005593295 mystifying_dewdney[76259]: 167 167
Jan 23 04:50:57 np0005593295 systemd[1]: libpod-71be559bbfdca95934b4dadc76f3f5bcc3a400d48a547b3dac1e6b29cc027d28.scope: Deactivated successfully.
Jan 23 04:50:57 np0005593295 podman[76242]: 2026-01-23 09:50:57.550740402 +0000 UTC m=+0.133814595 container died 71be559bbfdca95934b4dadc76f3f5bcc3a400d48a547b3dac1e6b29cc027d28 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_dewdney, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2)
Jan 23 04:50:57 np0005593295 systemd[1]: var-lib-containers-storage-overlay-60b98240dd3187f16bba69540ab298071e9949c26218fba674f832b87f664bae-merged.mount: Deactivated successfully.
Jan 23 04:50:57 np0005593295 podman[76242]: 2026-01-23 09:50:57.58379677 +0000 UTC m=+0.166870963 container remove 71be559bbfdca95934b4dadc76f3f5bcc3a400d48a547b3dac1e6b29cc027d28 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_dewdney, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0)
Jan 23 04:50:57 np0005593295 systemd[1]: libpod-conmon-71be559bbfdca95934b4dadc76f3f5bcc3a400d48a547b3dac1e6b29cc027d28.scope: Deactivated successfully.
Jan 23 04:50:57 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'mirroring'
Jan 23 04:50:57 np0005593295 systemd[1]: Reloading.
Jan 23 04:50:57 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:50:57 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:50:57 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'nfs'
Jan 23 04:50:57 np0005593295 systemd[1]: Reloading.
Jan 23 04:50:57 np0005593295 ceph-mgr[76120]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:50:57 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'orchestrator'
Jan 23 04:50:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:57.986+0000 7f857cdd3140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:50:57 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:50:58 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:50:58 np0005593295 systemd[1]: Starting Ceph crash.compute-2 for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:50:58 np0005593295 ceph-mgr[76120]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:50:58 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'osd_perf_query'
Jan 23 04:50:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:58.225+0000 7f857cdd3140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:50:58 np0005593295 ceph-mon[75771]: Deploying daemon crash.compute-2 on compute-2
Jan 23 04:50:58 np0005593295 ceph-mgr[76120]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:50:58 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'osd_support'
Jan 23 04:50:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:58.309+0000 7f857cdd3140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:50:58 np0005593295 systemd[1]: session-26.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593295 systemd[1]: session-20.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593295 systemd[1]: session-25.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Session 26 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Session 20 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Session 25 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593295 systemd[1]: session-27.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593295 systemd[1]: session-31.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593295 systemd[1]: session-23.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593295 systemd[1]: session-22.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593295 systemd[1]: session-28.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Removed session 26.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Session 27 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Session 32 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Session 22 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Session 31 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Session 23 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Session 28 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Removed session 20.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Removed session 25.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Session 24 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593295 systemd[1]: session-24.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593295 systemd[1]: session-29.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593295 systemd[1]: session-30.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Session 29 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Session 30 logged out. Waiting for processes to exit.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Removed session 27.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Removed session 31.
Jan 23 04:50:58 np0005593295 ceph-mgr[76120]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:50:58 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'pg_autoscaler'
Jan 23 04:50:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:58.389+0000 7f857cdd3140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Removed session 23.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Removed session 22.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Removed session 28.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Removed session 24.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Removed session 29.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Removed session 30.
Jan 23 04:50:58 np0005593295 podman[76399]: 2026-01-23 09:50:58.398090596 +0000 UTC m=+0.042916945 container create 044486c85d2f7920782c0ee61f8358742d2c669d4f1247ecac174b4901e18cb4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 04:50:58 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8afc076c02f5da977f90fdad2cc82748c899bf1be65a6b94ef039d15298fdf4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:58 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8afc076c02f5da977f90fdad2cc82748c899bf1be65a6b94ef039d15298fdf4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:58 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8afc076c02f5da977f90fdad2cc82748c899bf1be65a6b94ef039d15298fdf4/merged/etc/ceph/ceph.client.crash.compute-2.keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:58 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8afc076c02f5da977f90fdad2cc82748c899bf1be65a6b94ef039d15298fdf4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:58 np0005593295 podman[76399]: 2026-01-23 09:50:58.461494901 +0000 UTC m=+0.106321270 container init 044486c85d2f7920782c0ee61f8358742d2c669d4f1247ecac174b4901e18cb4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Jan 23 04:50:58 np0005593295 podman[76399]: 2026-01-23 09:50:58.466672138 +0000 UTC m=+0.111498487 container start 044486c85d2f7920782c0ee61f8358742d2c669d4f1247ecac174b4901e18cb4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid)
Jan 23 04:50:58 np0005593295 bash[76399]: 044486c85d2f7920782c0ee61f8358742d2c669d4f1247ecac174b4901e18cb4
Jan 23 04:50:58 np0005593295 podman[76399]: 2026-01-23 09:50:58.37808025 +0000 UTC m=+0.022906619 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:50:58 np0005593295 systemd[1]: Started Ceph crash.compute-2 for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:50:58 np0005593295 ceph-mgr[76120]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:50:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:58.477+0000 7f857cdd3140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:50:58 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'progress'
Jan 23 04:50:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2[76416]: INFO:ceph-crash:pinging cluster to exercise our key
Jan 23 04:50:58 np0005593295 systemd[1]: session-32.scope: Deactivated successfully.
Jan 23 04:50:58 np0005593295 systemd[1]: session-32.scope: Consumed 1min 22.769s CPU time.
Jan 23 04:50:58 np0005593295 systemd-logind[786]: Removed session 32.
Jan 23 04:50:58 np0005593295 ceph-mgr[76120]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:50:58 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'prometheus'
Jan 23 04:50:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:58.553+0000 7f857cdd3140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:50:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2[76416]: 2026-01-23T09:50:58.614+0000 7f57c5bd8640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 23 04:50:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2[76416]: 2026-01-23T09:50:58.614+0000 7f57c5bd8640 -1 AuthRegistry(0x7f57c0069b10) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 23 04:50:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2[76416]: 2026-01-23T09:50:58.616+0000 7f57c5bd8640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 23 04:50:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2[76416]: 2026-01-23T09:50:58.616+0000 7f57c5bd8640 -1 AuthRegistry(0x7f57c5bd6ff0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 23 04:50:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2[76416]: 2026-01-23T09:50:58.617+0000 7f57bffff640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 23 04:50:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2[76416]: 2026-01-23T09:50:58.618+0000 7f57beffd640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 23 04:50:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2[76416]: 2026-01-23T09:50:58.619+0000 7f57bf7fe640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 23 04:50:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2[76416]: 2026-01-23T09:50:58.619+0000 7f57c5bd8640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Jan 23 04:50:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2[76416]: [errno 13] RADOS permission denied (error connecting to the cluster)
Jan 23 04:50:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-2[76416]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Jan 23 04:50:58 np0005593295 ceph-mgr[76120]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:50:58 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'rbd_support'
Jan 23 04:50:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:58.935+0000 7f857cdd3140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:50:59 np0005593295 ceph-mgr[76120]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:50:59 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'restful'
Jan 23 04:50:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:59.044+0000 7f857cdd3140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:50:59 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/1618362368' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Jan 23 04:50:59 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'rgw'
Jan 23 04:50:59 np0005593295 ceph-mgr[76120]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:50:59 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'rook'
Jan 23 04:50:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:50:59.582+0000 7f857cdd3140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e31 _set_new_cache_sizes cache_size:1020052805 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:00 np0005593295 ceph-mgr[76120]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'selftest'
Jan 23 04:51:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:00.218+0000 7f857cdd3140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593295 ceph-mgr[76120]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:00.295+0000 7f857cdd3140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'snap_schedule'
Jan 23 04:51:00 np0005593295 ceph-mgr[76120]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:00.378+0000 7f857cdd3140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'stats'
Jan 23 04:51:00 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'status'
Jan 23 04:51:00 np0005593295 ceph-mgr[76120]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:00.557+0000 7f857cdd3140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'telegraf'
Jan 23 04:51:00 np0005593295 ceph-mgr[76120]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'telemetry'
Jan 23 04:51:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:00.650+0000 7f857cdd3140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593295 ceph-mgr[76120]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:51:00 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'test_orchestrator'
Jan 23 04:51:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:00.847+0000 7f857cdd3140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:51:01 np0005593295 ceph-mgr[76120]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:01 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'volumes'
Jan 23 04:51:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:01.114+0000 7f857cdd3140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:01 np0005593295 ceph-mgr[76120]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:51:01 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'zabbix'
Jan 23 04:51:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:01.498+0000 7f857cdd3140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:51:01 np0005593295 ceph-mgr[76120]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:51:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:01.585+0000 7f857cdd3140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:51:01 np0005593295 ceph-mgr[76120]: ms_deliver_dispatch: unhandled message 0x55f098898d00 mon_map magic: 0 from mon.0 v2:192.168.122.100:3300/0
Jan 23 04:51:01 np0005593295 ceph-mgr[76120]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 23 04:51:01 np0005593295 ceph-mgr[76120]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 23 04:51:01 np0005593295 ceph-mgr[76120]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 23 04:51:01 np0005593295 ceph-mgr[76120]: mgr respawn  1: '-n'
Jan 23 04:51:01 np0005593295 ceph-mgr[76120]: mgr respawn  2: 'mgr.compute-2.uczrot'
Jan 23 04:51:01 np0005593295 ceph-mgr[76120]: mgr respawn  3: '-f'
Jan 23 04:51:01 np0005593295 ceph-mgr[76120]: mgr respawn  4: '--setuser'
Jan 23 04:51:01 np0005593295 ceph-mgr[76120]: mgr respawn  5: 'ceph'
Jan 23 04:51:01 np0005593295 ceph-mgr[76120]: mgr respawn  6: '--setgroup'
Jan 23 04:51:01 np0005593295 ceph-mgr[76120]: mgr respawn  7: 'ceph'
Jan 23 04:51:01 np0005593295 ceph-mgr[76120]: mgr respawn  8: '--default-log-to-file=false'
Jan 23 04:51:01 np0005593295 ceph-mgr[76120]: mgr respawn  9: '--default-log-to-journald=true'
Jan 23 04:51:01 np0005593295 ceph-mgr[76120]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 23 04:51:01 np0005593295 ceph-mgr[76120]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Jan 23 04:51:01 np0005593295 ceph-mgr[76120]: mgr respawn  exe_path /proc/self/exe
Jan 23 04:51:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: ignoring --setuser ceph since I am not root
Jan 23 04:51:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: ignoring --setgroup ceph since I am not root
Jan 23 04:51:02 np0005593295 ceph-mgr[76120]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 23 04:51:02 np0005593295 ceph-mgr[76120]: pidfile_write: ignore empty --pid-file
Jan 23 04:51:03 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'alerts'
Jan 23 04:51:03 np0005593295 ceph-mgr[76120]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:51:03 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'balancer'
Jan 23 04:51:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:03.513+0000 7f3a29999140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:51:03 np0005593295 ceph-mgr[76120]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:51:03 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'cephadm'
Jan 23 04:51:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:03.601+0000 7f3a29999140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:51:04 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'crash'
Jan 23 04:51:04 np0005593295 ceph-mgr[76120]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:51:04 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'dashboard'
Jan 23 04:51:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:04.592+0000 7f3a29999140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:51:05 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e31 _set_new_cache_sizes cache_size:1020054705 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:05 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'devicehealth'
Jan 23 04:51:05 np0005593295 ceph-mgr[76120]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:51:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:05.432+0000 7f3a29999140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:51:05 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'diskprediction_local'
Jan 23 04:51:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 23 04:51:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 23 04:51:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]:  from numpy import show_config as show_numpy_config
Jan 23 04:51:05 np0005593295 ceph-mgr[76120]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:51:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:05.641+0000 7f3a29999140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:51:05 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'influx'
Jan 23 04:51:05 np0005593295 ceph-mgr[76120]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:51:05 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'insights'
Jan 23 04:51:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:05.726+0000 7f3a29999140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:51:05 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'iostat'
Jan 23 04:51:05 np0005593295 ceph-mgr[76120]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:51:05 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'k8sevents'
Jan 23 04:51:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:05.891+0000 7f3a29999140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:51:05 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e32 e32: 2 total, 2 up, 2 in
Jan 23 04:51:06 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'localpool'
Jan 23 04:51:06 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'mds_autoscaler'
Jan 23 04:51:06 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'mirroring'
Jan 23 04:51:06 np0005593295 systemd-logind[786]: New session 33 of user ceph-admin.
Jan 23 04:51:06 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'nfs'
Jan 23 04:51:06 np0005593295 systemd[1]: Started Session 33 of User ceph-admin.
Jan 23 04:51:07 np0005593295 ceph-mgr[76120]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:51:07 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'orchestrator'
Jan 23 04:51:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:07.033+0000 7f3a29999140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:51:07 np0005593295 ceph-mgr[76120]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:07 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'osd_perf_query'
Jan 23 04:51:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:07.290+0000 7f3a29999140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:07 np0005593295 ceph-mgr[76120]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:51:07 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'osd_support'
Jan 23 04:51:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:07.371+0000 7f3a29999140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:51:07 np0005593295 ceph-mgr[76120]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:51:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:07.448+0000 7f3a29999140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:51:07 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'pg_autoscaler'
Jan 23 04:51:07 np0005593295 ceph-mgr[76120]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:51:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:07.543+0000 7f3a29999140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:51:07 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'progress'
Jan 23 04:51:07 np0005593295 ceph-mgr[76120]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:51:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:07.621+0000 7f3a29999140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:51:07 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'prometheus'
Jan 23 04:51:07 np0005593295 ceph-mon[75771]: Active manager daemon compute-0.nbdygh restarted
Jan 23 04:51:07 np0005593295 ceph-mon[75771]: Activating manager daemon compute-0.nbdygh
Jan 23 04:51:07 np0005593295 podman[76594]: 2026-01-23 09:51:07.663429466 +0000 UTC m=+0.212839942 container exec 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True)
Jan 23 04:51:07 np0005593295 podman[76594]: 2026-01-23 09:51:07.772199301 +0000 UTC m=+0.321609777 container exec_died 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:51:08 np0005593295 ceph-mgr[76120]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:51:08 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'rbd_support'
Jan 23 04:51:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:08.038+0000 7f3a29999140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:51:08 np0005593295 ceph-mgr[76120]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:51:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:08.153+0000 7f3a29999140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:51:08 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'restful'
Jan 23 04:51:08 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'rgw'
Jan 23 04:51:08 np0005593295 ceph-mgr[76120]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:51:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:08.666+0000 7f3a29999140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:51:08 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'rook'
Jan 23 04:51:08 np0005593295 ceph-mon[75771]: Manager daemon compute-0.nbdygh is now available
Jan 23 04:51:08 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/mirror_snapshot_schedule"}]: dispatch
Jan 23 04:51:08 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/trash_purge_schedule"}]: dispatch
Jan 23 04:51:08 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:08 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:08 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:08 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:08 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:08 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:09 np0005593295 ceph-mgr[76120]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:51:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:09.367+0000 7f3a29999140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:51:09 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'selftest'
Jan 23 04:51:09 np0005593295 ceph-mgr[76120]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:51:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:09.451+0000 7f3a29999140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:51:09 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'snap_schedule'
Jan 23 04:51:09 np0005593295 ceph-mgr[76120]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:51:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:09.548+0000 7f3a29999140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:51:09 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'stats'
Jan 23 04:51:09 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'status'
Jan 23 04:51:09 np0005593295 ceph-mgr[76120]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:51:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:09.736+0000 7f3a29999140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:51:09 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'telegraf'
Jan 23 04:51:09 np0005593295 ceph-mgr[76120]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:51:09 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'telemetry'
Jan 23 04:51:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:09.818+0000 7f3a29999140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:51:09 np0005593295 ceph-mon[75771]: [23/Jan/2026:09:51:07] ENGINE Bus STARTING
Jan 23 04:51:09 np0005593295 ceph-mon[75771]: [23/Jan/2026:09:51:07] ENGINE Serving on http://192.168.122.100:8765
Jan 23 04:51:09 np0005593295 ceph-mon[75771]: [23/Jan/2026:09:51:07] ENGINE Serving on https://192.168.122.100:7150
Jan 23 04:51:09 np0005593295 ceph-mon[75771]: [23/Jan/2026:09:51:07] ENGINE Bus STARTED
Jan 23 04:51:09 np0005593295 ceph-mon[75771]: [23/Jan/2026:09:51:07] ENGINE Client ('192.168.122.100', 55612) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Jan 23 04:51:09 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:09 np0005593295 ceph-mgr[76120]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:51:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:09.986+0000 7f3a29999140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:51:09 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'test_orchestrator'
Jan 23 04:51:10 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e32 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:10 np0005593295 ceph-mgr[76120]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:10 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'volumes'
Jan 23 04:51:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:10.239+0000 7f3a29999140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:10 np0005593295 ceph-mgr[76120]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:51:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:10.560+0000 7f3a29999140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:51:10 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'zabbix'
Jan 23 04:51:10 np0005593295 ceph-mgr[76120]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:51:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:10.646+0000 7f3a29999140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:51:10 np0005593295 ceph-mgr[76120]: ms_deliver_dispatch: unhandled message 0x55e162952d00 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Jan 23 04:51:10 np0005593295 ceph-mgr[76120]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 23 04:51:10 np0005593295 ceph-mgr[76120]: mgr load Constructed class from module: dashboard
Jan 23 04:51:10 np0005593295 ceph-mgr[76120]: [dashboard INFO root] server: ssl=no host=:: port=8443
Jan 23 04:51:10 np0005593295 ceph-mgr[76120]: [dashboard INFO root] Configured CherryPy, starting engine...
Jan 23 04:51:10 np0005593295 ceph-mgr[76120]: [dashboard INFO root] Starting engine...
Jan 23 04:51:10 np0005593295 ceph-mgr[76120]: [dashboard INFO root] Engine started...
Jan 23 04:51:13 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:13 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:13 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:13 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:13 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 04:51:13 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:13 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Jan 23 04:51:13 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:13 np0005593295 ceph-mon[75771]: Adjusting osd_memory_target on compute-0 to 127.9M
Jan 23 04:51:13 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:13 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Jan 23 04:51:13 np0005593295 ceph-mon[75771]: Unable to set osd_memory_target on compute-0 to 134214860: error parsing value: Value '134214860' is below minimum 939524096
Jan 23 04:51:13 np0005593295 ceph-mon[75771]: Adjusting osd_memory_target on compute-1 to 127.9M
Jan 23 04:51:13 np0005593295 ceph-mon[75771]: Unable to set osd_memory_target on compute-1 to 134214860: error parsing value: Value '134214860' is below minimum 939524096
Jan 23 04:51:13 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:51:13 np0005593295 ceph-mon[75771]: Updating compute-0:/etc/ceph/ceph.conf
Jan 23 04:51:13 np0005593295 ceph-mon[75771]: Updating compute-1:/etc/ceph/ceph.conf
Jan 23 04:51:13 np0005593295 ceph-mon[75771]: Updating compute-2:/etc/ceph/ceph.conf
Jan 23 04:51:14 np0005593295 ceph-mon[75771]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 04:51:14 np0005593295 ceph-mon[75771]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 04:51:14 np0005593295 ceph-mon[75771]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 04:51:14 np0005593295 ceph-mon[75771]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 23 04:51:14 np0005593295 ceph-mon[75771]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Jan 23 04:51:14 np0005593295 ceph-mon[75771]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 23 04:51:14 np0005593295 ceph-mon[75771]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 04:51:14 np0005593295 ceph-mon[75771]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 04:51:14 np0005593295 ceph-mon[75771]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 04:51:14 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:14 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:14 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:14 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:14 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:14 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:14 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:14 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:14 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:15 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e32 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:15 np0005593295 ceph-mon[75771]: Deploying daemon node-exporter.compute-0 on compute-0
Jan 23 04:51:16 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:17 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:17 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:17 np0005593295 ceph-mon[75771]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:17 np0005593295 ceph-mon[75771]: Deploying daemon node-exporter.compute-1 on compute-1
Jan 23 04:51:17 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/1110789864' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Jan 23 04:51:17 np0005593295 ceph-mgr[76120]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 23 04:51:17 np0005593295 ceph-mgr[76120]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 23 04:51:17 np0005593295 ceph-mgr[76120]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 23 04:51:17 np0005593295 ceph-mgr[76120]: mgr respawn  1: '-n'
Jan 23 04:51:17 np0005593295 ceph-mgr[76120]: mgr respawn  2: 'mgr.compute-2.uczrot'
Jan 23 04:51:17 np0005593295 ceph-mgr[76120]: mgr respawn  3: '-f'
Jan 23 04:51:17 np0005593295 ceph-mgr[76120]: mgr respawn  4: '--setuser'
Jan 23 04:51:17 np0005593295 ceph-mgr[76120]: mgr respawn  5: 'ceph'
Jan 23 04:51:17 np0005593295 ceph-mgr[76120]: mgr respawn  6: '--setgroup'
Jan 23 04:51:17 np0005593295 ceph-mgr[76120]: mgr respawn  7: 'ceph'
Jan 23 04:51:17 np0005593295 ceph-mgr[76120]: mgr respawn  8: '--default-log-to-file=false'
Jan 23 04:51:17 np0005593295 ceph-mgr[76120]: mgr respawn  9: '--default-log-to-journald=true'
Jan 23 04:51:17 np0005593295 ceph-mgr[76120]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 23 04:51:17 np0005593295 ceph-mgr[76120]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Jan 23 04:51:17 np0005593295 ceph-mgr[76120]: mgr respawn  exe_path /proc/self/exe
Jan 23 04:51:17 np0005593295 systemd[1]: session-33.scope: Deactivated successfully.
Jan 23 04:51:17 np0005593295 systemd[1]: session-33.scope: Consumed 4.234s CPU time.
Jan 23 04:51:17 np0005593295 systemd-logind[786]: Session 33 logged out. Waiting for processes to exit.
Jan 23 04:51:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: ignoring --setuser ceph since I am not root
Jan 23 04:51:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: ignoring --setgroup ceph since I am not root
Jan 23 04:51:17 np0005593295 systemd-logind[786]: Removed session 33.
Jan 23 04:51:17 np0005593295 ceph-mgr[76120]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 23 04:51:17 np0005593295 ceph-mgr[76120]: pidfile_write: ignore empty --pid-file
Jan 23 04:51:17 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'alerts'
Jan 23 04:51:18 np0005593295 ceph-mgr[76120]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:51:18 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'balancer'
Jan 23 04:51:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:18.006+0000 7fc4a9399140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:51:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:18.098+0000 7fc4a9399140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:51:18 np0005593295 ceph-mgr[76120]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:51:18 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'cephadm'
Jan 23 04:51:18 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/1110789864' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Jan 23 04:51:18 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/435334493' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Jan 23 04:51:18 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'crash'
Jan 23 04:51:19 np0005593295 ceph-mgr[76120]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:51:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:19.037+0000 7fc4a9399140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:51:19 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'dashboard'
Jan 23 04:51:19 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'devicehealth'
Jan 23 04:51:19 np0005593295 ceph-mgr[76120]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:51:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:19.734+0000 7fc4a9399140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:51:19 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'diskprediction_local'
Jan 23 04:51:19 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/435334493' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Jan 23 04:51:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 23 04:51:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 23 04:51:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]:  from numpy import show_config as show_numpy_config
Jan 23 04:51:19 np0005593295 ceph-mgr[76120]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:51:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:19.932+0000 7fc4a9399140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:51:19 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'influx'
Jan 23 04:51:20 np0005593295 ceph-mgr[76120]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:51:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:20.019+0000 7fc4a9399140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:51:20 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'insights'
Jan 23 04:51:20 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'iostat'
Jan 23 04:51:20 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e32 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:20 np0005593295 ceph-mgr[76120]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:51:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:20.170+0000 7fc4a9399140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:51:20 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'k8sevents'
Jan 23 04:51:20 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'localpool'
Jan 23 04:51:20 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'mds_autoscaler'
Jan 23 04:51:20 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'mirroring'
Jan 23 04:51:21 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'nfs'
Jan 23 04:51:21 np0005593295 ceph-mgr[76120]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:21.331+0000 7fc4a9399140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'orchestrator'
Jan 23 04:51:21 np0005593295 ceph-mgr[76120]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:21.579+0000 7fc4a9399140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'osd_perf_query'
Jan 23 04:51:21 np0005593295 ceph-mgr[76120]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:21.658+0000 7fc4a9399140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'osd_support'
Jan 23 04:51:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:21.730+0000 7fc4a9399140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593295 ceph-mgr[76120]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'pg_autoscaler'
Jan 23 04:51:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:21.822+0000 7fc4a9399140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593295 ceph-mgr[76120]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'progress'
Jan 23 04:51:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:21.900+0000 7fc4a9399140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593295 ceph-mgr[76120]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:51:21 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'prometheus'
Jan 23 04:51:22 np0005593295 ceph-mgr[76120]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:51:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:22.289+0000 7fc4a9399140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:51:22 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'rbd_support'
Jan 23 04:51:22 np0005593295 ceph-mgr[76120]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:51:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:22.416+0000 7fc4a9399140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:51:22 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'restful'
Jan 23 04:51:22 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'rgw'
Jan 23 04:51:22 np0005593295 ceph-mgr[76120]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:51:22 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'rook'
Jan 23 04:51:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:22.933+0000 7fc4a9399140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593295 ceph-mgr[76120]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:23.562+0000 7fc4a9399140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'selftest'
Jan 23 04:51:23 np0005593295 ceph-mgr[76120]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:23.637+0000 7fc4a9399140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'snap_schedule'
Jan 23 04:51:23 np0005593295 ceph-mgr[76120]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'stats'
Jan 23 04:51:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:23.725+0000 7fc4a9399140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'status'
Jan 23 04:51:23 np0005593295 ceph-mgr[76120]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:23.888+0000 7fc4a9399140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'telegraf'
Jan 23 04:51:23 np0005593295 ceph-mgr[76120]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:51:23 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'telemetry'
Jan 23 04:51:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:23.974+0000 7fc4a9399140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:51:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:24.153+0000 7fc4a9399140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'test_orchestrator'
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'volumes'
Jan 23 04:51:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:24.421+0000 7fc4a9399140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:24.725+0000 7fc4a9399140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'zabbix'
Jan 23 04:51:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:24.804+0000 7fc4a9399140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: ms_deliver_dispatch: unhandled message 0x55caeacef860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr respawn  1: '-n'
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr respawn  2: 'mgr.compute-2.uczrot'
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr respawn  3: '-f'
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr respawn  4: '--setuser'
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr respawn  5: 'ceph'
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr respawn  6: '--setgroup'
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr respawn  7: 'ceph'
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr respawn  8: '--default-log-to-file=false'
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr respawn  9: '--default-log-to-journald=true'
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr respawn  exe_path /proc/self/exe
Jan 23 04:51:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: ignoring --setuser ceph since I am not root
Jan 23 04:51:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: ignoring --setgroup ceph since I am not root
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: pidfile_write: ignore empty --pid-file
Jan 23 04:51:24 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'alerts'
Jan 23 04:51:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:25.081+0000 7f2757a28140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:51:25 np0005593295 ceph-mgr[76120]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:51:25 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'balancer'
Jan 23 04:51:25 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e32 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:25.189+0000 7f2757a28140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:51:25 np0005593295 ceph-mgr[76120]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:51:25 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'cephadm'
Jan 23 04:51:25 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e33 e33: 2 total, 2 up, 2 in
Jan 23 04:51:25 np0005593295 ceph-mon[75771]: Active manager daemon compute-0.nbdygh restarted
Jan 23 04:51:25 np0005593295 ceph-mon[75771]: Activating manager daemon compute-0.nbdygh
Jan 23 04:51:26 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'crash'
Jan 23 04:51:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:26.189+0000 7f2757a28140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:51:26 np0005593295 ceph-mgr[76120]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:51:26 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'dashboard'
Jan 23 04:51:26 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'devicehealth'
Jan 23 04:51:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:26.932+0000 7f2757a28140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:51:26 np0005593295 ceph-mgr[76120]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:51:26 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'diskprediction_local'
Jan 23 04:51:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 23 04:51:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 23 04:51:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]:  from numpy import show_config as show_numpy_config
Jan 23 04:51:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:27.158+0000 7f2757a28140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:51:27 np0005593295 ceph-mgr[76120]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:51:27 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'influx'
Jan 23 04:51:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:27.251+0000 7f2757a28140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:51:27 np0005593295 ceph-mgr[76120]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:51:27 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'insights'
Jan 23 04:51:27 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'iostat'
Jan 23 04:51:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:27.414+0000 7f2757a28140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:51:27 np0005593295 ceph-mgr[76120]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:51:27 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'k8sevents'
Jan 23 04:51:27 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'localpool'
Jan 23 04:51:27 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'mds_autoscaler'
Jan 23 04:51:28 np0005593295 systemd[1]: Stopping User Manager for UID 42477...
Jan 23 04:51:28 np0005593295 systemd[72559]: Activating special unit Exit the Session...
Jan 23 04:51:28 np0005593295 systemd[72559]: Stopped target Main User Target.
Jan 23 04:51:28 np0005593295 systemd[72559]: Stopped target Basic System.
Jan 23 04:51:28 np0005593295 systemd[72559]: Stopped target Paths.
Jan 23 04:51:28 np0005593295 systemd[72559]: Stopped target Sockets.
Jan 23 04:51:28 np0005593295 systemd[72559]: Stopped target Timers.
Jan 23 04:51:28 np0005593295 systemd[72559]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 04:51:28 np0005593295 systemd[72559]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 04:51:28 np0005593295 systemd[72559]: Closed D-Bus User Message Bus Socket.
Jan 23 04:51:28 np0005593295 systemd[72559]: Stopped Create User's Volatile Files and Directories.
Jan 23 04:51:28 np0005593295 systemd[72559]: Removed slice User Application Slice.
Jan 23 04:51:28 np0005593295 systemd[72559]: Reached target Shutdown.
Jan 23 04:51:28 np0005593295 systemd[72559]: Finished Exit the Session.
Jan 23 04:51:28 np0005593295 systemd[72559]: Reached target Exit the Session.
Jan 23 04:51:28 np0005593295 systemd[1]: user@42477.service: Deactivated successfully.
Jan 23 04:51:28 np0005593295 systemd[1]: Stopped User Manager for UID 42477.
Jan 23 04:51:28 np0005593295 systemd[1]: Stopping User Runtime Directory /run/user/42477...
Jan 23 04:51:28 np0005593295 systemd[1]: run-user-42477.mount: Deactivated successfully.
Jan 23 04:51:28 np0005593295 systemd[1]: user-runtime-dir@42477.service: Deactivated successfully.
Jan 23 04:51:28 np0005593295 systemd[1]: Stopped User Runtime Directory /run/user/42477.
Jan 23 04:51:28 np0005593295 systemd[1]: Removed slice User Slice of UID 42477.
Jan 23 04:51:28 np0005593295 systemd[1]: user-42477.slice: Consumed 1min 28.092s CPU time.
Jan 23 04:51:28 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'mirroring'
Jan 23 04:51:28 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'nfs'
Jan 23 04:51:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:28.602+0000 7f2757a28140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:51:28 np0005593295 ceph-mgr[76120]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:51:28 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'orchestrator'
Jan 23 04:51:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:28.874+0000 7f2757a28140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:28 np0005593295 ceph-mgr[76120]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:28 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'osd_perf_query'
Jan 23 04:51:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:28.973+0000 7f2757a28140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:51:28 np0005593295 ceph-mgr[76120]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:51:28 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'osd_support'
Jan 23 04:51:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:29.069+0000 7f2757a28140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:51:29 np0005593295 ceph-mgr[76120]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:51:29 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'pg_autoscaler'
Jan 23 04:51:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:29.152+0000 7f2757a28140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:51:29 np0005593295 ceph-mgr[76120]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:51:29 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'progress'
Jan 23 04:51:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:29.235+0000 7f2757a28140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:51:29 np0005593295 ceph-mgr[76120]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:51:29 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'prometheus'
Jan 23 04:51:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:29.605+0000 7f2757a28140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:51:29 np0005593295 ceph-mgr[76120]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:51:29 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'rbd_support'
Jan 23 04:51:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:29.719+0000 7f2757a28140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:51:29 np0005593295 ceph-mgr[76120]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:51:29 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'restful'
Jan 23 04:51:29 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'rgw'
Jan 23 04:51:30 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e33 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:30.258+0000 7f2757a28140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:51:30 np0005593295 ceph-mgr[76120]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:51:30 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'rook'
Jan 23 04:51:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:30.943+0000 7f2757a28140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:51:30 np0005593295 ceph-mgr[76120]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:51:30 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'selftest'
Jan 23 04:51:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:31.029+0000 7f2757a28140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:51:31 np0005593295 ceph-mgr[76120]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:51:31 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'snap_schedule'
Jan 23 04:51:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:31.133+0000 7f2757a28140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:51:31 np0005593295 ceph-mgr[76120]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:51:31 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'stats'
Jan 23 04:51:31 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'status'
Jan 23 04:51:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:31.315+0000 7f2757a28140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:51:31 np0005593295 ceph-mgr[76120]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:51:31 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'telegraf'
Jan 23 04:51:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:31.394+0000 7f2757a28140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:51:31 np0005593295 ceph-mgr[76120]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:51:31 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'telemetry'
Jan 23 04:51:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:31.571+0000 7f2757a28140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:51:31 np0005593295 ceph-mgr[76120]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:51:31 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'test_orchestrator'
Jan 23 04:51:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:31.815+0000 7f2757a28140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:31 np0005593295 ceph-mgr[76120]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:51:31 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'volumes'
Jan 23 04:51:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:32.129+0000 7f2757a28140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:51:32 np0005593295 ceph-mgr[76120]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:51:32 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'zabbix'
Jan 23 04:51:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:51:32.223+0000 7f2757a28140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:51:32 np0005593295 ceph-mgr[76120]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:51:32 np0005593295 ceph-mgr[76120]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 23 04:51:32 np0005593295 ceph-mgr[76120]: mgr load Constructed class from module: dashboard
Jan 23 04:51:32 np0005593295 ceph-mgr[76120]: [dashboard INFO root] server: ssl=no host=:: port=8443
Jan 23 04:51:32 np0005593295 ceph-mgr[76120]: [dashboard INFO root] Configured CherryPy, starting engine...
Jan 23 04:51:32 np0005593295 ceph-mgr[76120]: [dashboard INFO root] Starting engine...
Jan 23 04:51:32 np0005593295 ceph-mgr[76120]: ms_deliver_dispatch: unhandled message 0x55cfcdde1860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Jan 23 04:51:32 np0005593295 ceph-mgr[76120]: [dashboard INFO root] Engine started...
Jan 23 04:51:32 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e34 e34: 2 total, 2 up, 2 in
Jan 23 04:51:33 np0005593295 ceph-mon[75771]: Active manager daemon compute-0.nbdygh restarted
Jan 23 04:51:33 np0005593295 ceph-mon[75771]: Activating manager daemon compute-0.nbdygh
Jan 23 04:51:33 np0005593295 ceph-mon[75771]: Manager daemon compute-0.nbdygh is now available
Jan 23 04:51:33 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/mirror_snapshot_schedule"}]: dispatch
Jan 23 04:51:33 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/trash_purge_schedule"}]: dispatch
Jan 23 04:51:33 np0005593295 systemd[1]: Created slice User Slice of UID 42477.
Jan 23 04:51:33 np0005593295 systemd[1]: Starting User Runtime Directory /run/user/42477...
Jan 23 04:51:33 np0005593295 systemd-logind[786]: New session 34 of user ceph-admin.
Jan 23 04:51:33 np0005593295 systemd[1]: Finished User Runtime Directory /run/user/42477.
Jan 23 04:51:33 np0005593295 systemd[1]: Starting User Manager for UID 42477...
Jan 23 04:51:33 np0005593295 systemd[77796]: Queued start job for default target Main User Target.
Jan 23 04:51:33 np0005593295 systemd[77796]: Created slice User Application Slice.
Jan 23 04:51:33 np0005593295 systemd[77796]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 04:51:33 np0005593295 systemd[77796]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 04:51:33 np0005593295 systemd[77796]: Reached target Paths.
Jan 23 04:51:33 np0005593295 systemd[77796]: Reached target Timers.
Jan 23 04:51:33 np0005593295 systemd[77796]: Starting D-Bus User Message Bus Socket...
Jan 23 04:51:33 np0005593295 systemd[77796]: Starting Create User's Volatile Files and Directories...
Jan 23 04:51:33 np0005593295 systemd[77796]: Listening on D-Bus User Message Bus Socket.
Jan 23 04:51:33 np0005593295 systemd[77796]: Reached target Sockets.
Jan 23 04:51:33 np0005593295 systemd[77796]: Finished Create User's Volatile Files and Directories.
Jan 23 04:51:33 np0005593295 systemd[77796]: Reached target Basic System.
Jan 23 04:51:33 np0005593295 systemd[77796]: Reached target Main User Target.
Jan 23 04:51:33 np0005593295 systemd[77796]: Startup finished in 127ms.
Jan 23 04:51:33 np0005593295 systemd[1]: Started User Manager for UID 42477.
Jan 23 04:51:33 np0005593295 systemd[1]: Started Session 34 of User ceph-admin.
Jan 23 04:51:34 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).mds e2 new map
Jan 23 04:51:34 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).mds e2 print_map#012e2#012btime 2026-01-23T09:51:34:000852+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:51:34.000760+0000#012modified#0112026-01-23T09:51:34.000760+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 0 members: #012 #012 
Jan 23 04:51:34 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e35 e35: 2 total, 2 up, 2 in
Jan 23 04:51:34 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Jan 23 04:51:34 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Jan 23 04:51:34 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Jan 23 04:51:34 np0005593295 ceph-mon[75771]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Jan 23 04:51:34 np0005593295 ceph-mon[75771]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Jan 23 04:51:34 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Jan 23 04:51:34 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:34 np0005593295 podman[77933]: 2026-01-23 09:51:34.722271397 +0000 UTC m=+0.079571015 container exec 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:51:34 np0005593295 podman[77933]: 2026-01-23 09:51:34.829184525 +0000 UTC m=+0.186484133 container exec_died 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:51:35 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e35 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:35 np0005593295 ceph-mon[75771]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 23 04:51:35 np0005593295 ceph-mon[75771]: [23/Jan/2026:09:51:34] ENGINE Bus STARTING
Jan 23 04:51:35 np0005593295 ceph-mon[75771]: [23/Jan/2026:09:51:34] ENGINE Serving on http://192.168.122.100:8765
Jan 23 04:51:35 np0005593295 ceph-mon[75771]: [23/Jan/2026:09:51:34] ENGINE Serving on https://192.168.122.100:7150
Jan 23 04:51:35 np0005593295 ceph-mon[75771]: [23/Jan/2026:09:51:34] ENGINE Bus STARTED
Jan 23 04:51:35 np0005593295 ceph-mon[75771]: [23/Jan/2026:09:51:34] ENGINE Client ('192.168.122.100', 48072) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Jan 23 04:51:35 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:35 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:35 np0005593295 ceph-mon[75771]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 23 04:51:35 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:35 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:35 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:35 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:35 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:36 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:36 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:36 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 04:51:36 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]: dispatch
Jan 23 04:51:37 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e36 e36: 2 total, 2 up, 2 in
Jan 23 04:51:37 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:37 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:37 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 04:51:37 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:37 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Jan 23 04:51:37 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:37 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 04:51:37 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:51:37 np0005593295 ceph-mon[75771]: Updating compute-0:/etc/ceph/ceph.conf
Jan 23 04:51:37 np0005593295 ceph-mon[75771]: Updating compute-1:/etc/ceph/ceph.conf
Jan 23 04:51:37 np0005593295 ceph-mon[75771]: Updating compute-2:/etc/ceph/ceph.conf
Jan 23 04:51:37 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]': finished
Jan 23 04:51:37 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]: dispatch
Jan 23 04:51:38 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e37 e37: 2 total, 2 up, 2 in
Jan 23 04:51:39 np0005593295 ceph-mon[75771]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 04:51:39 np0005593295 ceph-mon[75771]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 04:51:39 np0005593295 ceph-mon[75771]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 04:51:39 np0005593295 ceph-mon[75771]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 23 04:51:39 np0005593295 ceph-mon[75771]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Jan 23 04:51:39 np0005593295 ceph-mon[75771]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 23 04:51:39 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]': finished
Jan 23 04:51:39 np0005593295 ceph-mon[75771]: Saving service nfs.cephfs spec with placement compute-0;compute-1;compute-2
Jan 23 04:51:39 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:39 np0005593295 ceph-mon[75771]: Saving service ingress.nfs.cephfs spec with placement compute-0;compute-1;compute-2
Jan 23 04:51:39 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:39 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e38 e38: 2 total, 2 up, 2 in
Jan 23 04:51:40 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:40 np0005593295 ceph-mon[75771]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 04:51:40 np0005593295 ceph-mon[75771]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 04:51:40 np0005593295 ceph-mon[75771]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 04:51:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:40 np0005593295 systemd[1]: Reloading.
Jan 23 04:51:40 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:51:40 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:51:40 np0005593295 systemd[1]: Reloading.
Jan 23 04:51:40 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:51:40 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:51:40 np0005593295 systemd[1]: Starting Ceph node-exporter.compute-2 for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:51:40 np0005593295 bash[79260]: Trying to pull quay.io/prometheus/node-exporter:v1.7.0...
Jan 23 04:51:41 np0005593295 ceph-mon[75771]: Deploying daemon node-exporter.compute-2 on compute-2
Jan 23 04:51:41 np0005593295 bash[79260]: Getting image source signatures
Jan 23 04:51:41 np0005593295 bash[79260]: Copying blob sha256:324153f2810a9927fcce320af9e4e291e0b6e805cbdd1f338386c756b9defa24
Jan 23 04:51:41 np0005593295 bash[79260]: Copying blob sha256:2abcce694348cd2c949c0e98a7400ebdfd8341021bcf6b541bc72033ce982510
Jan 23 04:51:41 np0005593295 bash[79260]: Copying blob sha256:455fd88e5221bc1e278ef2d059cd70e4df99a24e5af050ede621534276f6cf9a
Jan 23 04:51:43 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/992291970' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Jan 23 04:51:43 np0005593295 bash[79260]: Copying config sha256:72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e
Jan 23 04:51:43 np0005593295 bash[79260]: Writing manifest to image destination
Jan 23 04:51:43 np0005593295 podman[79260]: 2026-01-23 09:51:43.983179142 +0000 UTC m=+3.122504866 image pull 72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e quay.io/prometheus/node-exporter:v1.7.0
Jan 23 04:51:44 np0005593295 podman[79260]: 2026-01-23 09:51:44.147557358 +0000 UTC m=+3.286883062 container create 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 04:51:44 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47d514d8f55bda982f888d0e7f03ebab6be03e078204b04f7e76c86d45f56d75/merged/etc/node-exporter supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:44 np0005593295 podman[79260]: 2026-01-23 09:51:44.237366071 +0000 UTC m=+3.376691785 container init 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 04:51:44 np0005593295 podman[79260]: 2026-01-23 09:51:44.242201258 +0000 UTC m=+3.381526952 container start 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 04:51:44 np0005593295 bash[79260]: 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0
Jan 23 04:51:44 np0005593295 systemd[1]: Started Ceph node-exporter.compute-2 for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.270Z caller=node_exporter.go:192 level=info msg="Starting node_exporter" version="(version=1.7.0, branch=HEAD, revision=7333465abf9efba81876303bb57e6fadb946041b)"
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.270Z caller=node_exporter.go:193 level=info msg="Build context" build_context="(go=go1.21.4, platform=linux/amd64, user=root@35918982f6d8, date=20231112-23:53:35, tags=netgo osusergo static_build)"
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.270Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.270Z caller=diskstats_linux.go:265 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.271Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.271Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=arp
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=bcache
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=bonding
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=btrfs
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=conntrack
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=cpu
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=cpufreq
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=diskstats
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=dmi
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=edac
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=entropy
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=fibrechannel
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=filefd
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=filesystem
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=hwmon
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=infiniband
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=ipvs
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=loadavg
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=mdadm
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=meminfo
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=netclass
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=netdev
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=netstat
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=nfs
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=nfsd
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=nvme
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=os
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=powersupplyclass
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=pressure
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=rapl
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=schedstat
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=selinux
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=sockstat
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=softnet
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=stat
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=tapestats
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=textfile
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=thermal_zone
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=time
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=udp_queues
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=uname
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=vmstat
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=xfs
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.272Z caller=node_exporter.go:117 level=info collector=zfs
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.273Z caller=tls_config.go:274 level=info msg="Listening on" address=[::]:9100
Jan 23 04:51:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2[79335]: ts=2026-01-23T09:51:44.273Z caller=tls_config.go:277 level=info msg="TLS is disabled." http2=false address=[::]:9100
Jan 23 04:51:44 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/992291970' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Jan 23 04:51:44 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:45 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:45 np0005593295 podman[79432]: 2026-01-23 09:51:45.845432886 +0000 UTC m=+0.038935176 container create be7108f60bdb8f8cf9a1ba541ae5e493a168a337a337e6d498330cc2d461c9c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=friendly_hamilton, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:51:45 np0005593295 systemd[1]: Started libpod-conmon-be7108f60bdb8f8cf9a1ba541ae5e493a168a337a337e6d498330cc2d461c9c6.scope.
Jan 23 04:51:45 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:51:45 np0005593295 podman[79432]: 2026-01-23 09:51:45.912305092 +0000 UTC m=+0.105807402 container init be7108f60bdb8f8cf9a1ba541ae5e493a168a337a337e6d498330cc2d461c9c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=friendly_hamilton, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:51:45 np0005593295 podman[79432]: 2026-01-23 09:51:45.918478183 +0000 UTC m=+0.111980473 container start be7108f60bdb8f8cf9a1ba541ae5e493a168a337a337e6d498330cc2d461c9c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=friendly_hamilton, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:51:45 np0005593295 friendly_hamilton[79448]: 167 167
Jan 23 04:51:45 np0005593295 systemd[1]: libpod-be7108f60bdb8f8cf9a1ba541ae5e493a168a337a337e6d498330cc2d461c9c6.scope: Deactivated successfully.
Jan 23 04:51:45 np0005593295 podman[79432]: 2026-01-23 09:51:45.924494918 +0000 UTC m=+0.117997238 container attach be7108f60bdb8f8cf9a1ba541ae5e493a168a337a337e6d498330cc2d461c9c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=friendly_hamilton, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Jan 23 04:51:45 np0005593295 podman[79432]: 2026-01-23 09:51:45.828024754 +0000 UTC m=+0.021527064 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:51:45 np0005593295 podman[79432]: 2026-01-23 09:51:45.924774955 +0000 UTC m=+0.118277265 container died be7108f60bdb8f8cf9a1ba541ae5e493a168a337a337e6d498330cc2d461c9c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=friendly_hamilton, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Jan 23 04:51:45 np0005593295 systemd[1]: var-lib-containers-storage-overlay-d4ba7fbffb29616fb715adf85dba1eb366bec505e99f037e54b2485607177553-merged.mount: Deactivated successfully.
Jan 23 04:51:45 np0005593295 podman[79432]: 2026-01-23 09:51:45.968737403 +0000 UTC m=+0.162239683 container remove be7108f60bdb8f8cf9a1ba541ae5e493a168a337a337e6d498330cc2d461c9c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=friendly_hamilton, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 23 04:51:45 np0005593295 systemd[1]: libpod-conmon-be7108f60bdb8f8cf9a1ba541ae5e493a168a337a337e6d498330cc2d461c9c6.scope: Deactivated successfully.
Jan 23 04:51:46 np0005593295 podman[79471]: 2026-01-23 09:51:46.124803827 +0000 UTC m=+0.040137007 container create 589c9f08c1696c69651ddfcaf0af378fa4e5a67bebed1db0cd2030d4a93f9d4f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_bose, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:51:46 np0005593295 systemd[1]: Started libpod-conmon-589c9f08c1696c69651ddfcaf0af378fa4e5a67bebed1db0cd2030d4a93f9d4f.scope.
Jan 23 04:51:46 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:51:46 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbe30176b428903d0592b42422c0ecd0f0bce2f81248b2af5644eb8b05a70154/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:46 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbe30176b428903d0592b42422c0ecd0f0bce2f81248b2af5644eb8b05a70154/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:46 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbe30176b428903d0592b42422c0ecd0f0bce2f81248b2af5644eb8b05a70154/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:46 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbe30176b428903d0592b42422c0ecd0f0bce2f81248b2af5644eb8b05a70154/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:46 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbe30176b428903d0592b42422c0ecd0f0bce2f81248b2af5644eb8b05a70154/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:46 np0005593295 podman[79471]: 2026-01-23 09:51:46.193307642 +0000 UTC m=+0.108640842 container init 589c9f08c1696c69651ddfcaf0af378fa4e5a67bebed1db0cd2030d4a93f9d4f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_bose, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:51:46 np0005593295 podman[79471]: 2026-01-23 09:51:46.199903713 +0000 UTC m=+0.115236893 container start 589c9f08c1696c69651ddfcaf0af378fa4e5a67bebed1db0cd2030d4a93f9d4f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_bose, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:51:46 np0005593295 podman[79471]: 2026-01-23 09:51:46.110162851 +0000 UTC m=+0.025496051 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:51:46 np0005593295 podman[79471]: 2026-01-23 09:51:46.207531688 +0000 UTC m=+0.122864868 container attach 589c9f08c1696c69651ddfcaf0af378fa4e5a67bebed1db0cd2030d4a93f9d4f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_bose, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Jan 23 04:51:46 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:46 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:46 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:46 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:51:46 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:51:46 np0005593295 trusting_bose[79487]: --> passed data devices: 0 physical, 1 LVM
Jan 23 04:51:46 np0005593295 trusting_bose[79487]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:51:46 np0005593295 trusting_bose[79487]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:51:46 np0005593295 trusting_bose[79487]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 2edb8fa1-89ea-44cd-9b6e-9f4d89095397
Jan 23 04:51:47 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd new", "uuid": "2edb8fa1-89ea-44cd-9b6e-9f4d89095397"} v 0)
Jan 23 04:51:47 np0005593295 ceph-mon[75771]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1205331151' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "2edb8fa1-89ea-44cd-9b6e-9f4d89095397"}]: dispatch
Jan 23 04:51:47 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e39 e39: 3 total, 2 up, 3 in
Jan 23 04:51:48 np0005593295 trusting_bose[79487]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Jan 23 04:51:48 np0005593295 trusting_bose[79487]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Jan 23 04:51:48 np0005593295 trusting_bose[79487]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 04:51:48 np0005593295 trusting_bose[79487]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 23 04:51:48 np0005593295 lvm[79552]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 04:51:48 np0005593295 lvm[79552]: VG ceph_vg0 finished
Jan 23 04:51:48 np0005593295 trusting_bose[79487]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Jan 23 04:51:48 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.102:0/1205331151' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "2edb8fa1-89ea-44cd-9b6e-9f4d89095397"}]: dispatch
Jan 23 04:51:48 np0005593295 ceph-mon[75771]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "2edb8fa1-89ea-44cd-9b6e-9f4d89095397"}]: dispatch
Jan 23 04:51:48 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon getmap"} v 0)
Jan 23 04:51:48 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3212942412' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Jan 23 04:51:48 np0005593295 trusting_bose[79487]: stderr: got monmap epoch 3
Jan 23 04:51:48 np0005593295 trusting_bose[79487]: --> Creating keyring file for osd.2
Jan 23 04:51:48 np0005593295 trusting_bose[79487]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Jan 23 04:51:48 np0005593295 trusting_bose[79487]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Jan 23 04:51:48 np0005593295 trusting_bose[79487]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid 2edb8fa1-89ea-44cd-9b6e-9f4d89095397 --setuser ceph --setgroup ceph
Jan 23 04:51:49 np0005593295 ceph-mon[75771]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "2edb8fa1-89ea-44cd-9b6e-9f4d89095397"}]': finished
Jan 23 04:51:49 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:49 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/3560526778' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Jan 23 04:51:50 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:53 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:53 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:51:53 np0005593295 trusting_bose[79487]: stderr: 2026-01-23T09:51:48.820+0000 7f0326444740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) No valid bdev label found
Jan 23 04:51:53 np0005593295 trusting_bose[79487]: stderr: 2026-01-23T09:51:49.082+0000 7f0326444740 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Jan 23 04:51:53 np0005593295 trusting_bose[79487]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Jan 23 04:51:53 np0005593295 trusting_bose[79487]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 23 04:51:53 np0005593295 trusting_bose[79487]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Jan 23 04:51:53 np0005593295 trusting_bose[79487]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 23 04:51:53 np0005593295 trusting_bose[79487]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Jan 23 04:51:53 np0005593295 trusting_bose[79487]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 04:51:53 np0005593295 trusting_bose[79487]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 23 04:51:53 np0005593295 trusting_bose[79487]: --> ceph-volume lvm activate successful for osd ID: 2
Jan 23 04:51:53 np0005593295 trusting_bose[79487]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Jan 23 04:51:53 np0005593295 systemd[1]: libpod-589c9f08c1696c69651ddfcaf0af378fa4e5a67bebed1db0cd2030d4a93f9d4f.scope: Deactivated successfully.
Jan 23 04:51:53 np0005593295 podman[79471]: 2026-01-23 09:51:53.638161115 +0000 UTC m=+7.553494295 container died 589c9f08c1696c69651ddfcaf0af378fa4e5a67bebed1db0cd2030d4a93f9d4f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_bose, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Jan 23 04:51:53 np0005593295 systemd[1]: libpod-589c9f08c1696c69651ddfcaf0af378fa4e5a67bebed1db0cd2030d4a93f9d4f.scope: Consumed 3.976s CPU time.
Jan 23 04:51:53 np0005593295 systemd[1]: var-lib-containers-storage-overlay-cbe30176b428903d0592b42422c0ecd0f0bce2f81248b2af5644eb8b05a70154-merged.mount: Deactivated successfully.
Jan 23 04:51:53 np0005593295 podman[79471]: 2026-01-23 09:51:53.720536508 +0000 UTC m=+7.635869688 container remove 589c9f08c1696c69651ddfcaf0af378fa4e5a67bebed1db0cd2030d4a93f9d4f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_bose, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Jan 23 04:51:53 np0005593295 systemd[1]: libpod-conmon-589c9f08c1696c69651ddfcaf0af378fa4e5a67bebed1db0cd2030d4a93f9d4f.scope: Deactivated successfully.
Jan 23 04:51:54 np0005593295 podman[80562]: 2026-01-23 09:51:54.279903823 +0000 UTC m=+0.042832181 container create 5b2080acc631c1cc0b8eb238fc7df026d3c90f64b5cb878c1921b92e7bf1d5e5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_tharp, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Jan 23 04:51:54 np0005593295 systemd[1]: Started libpod-conmon-5b2080acc631c1cc0b8eb238fc7df026d3c90f64b5cb878c1921b92e7bf1d5e5.scope.
Jan 23 04:51:54 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:51:54 np0005593295 podman[80562]: 2026-01-23 09:51:54.355322626 +0000 UTC m=+0.118251014 container init 5b2080acc631c1cc0b8eb238fc7df026d3c90f64b5cb878c1921b92e7bf1d5e5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_tharp, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:51:54 np0005593295 podman[80562]: 2026-01-23 09:51:54.260056631 +0000 UTC m=+0.022985019 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:51:54 np0005593295 podman[80562]: 2026-01-23 09:51:54.363308991 +0000 UTC m=+0.126237349 container start 5b2080acc631c1cc0b8eb238fc7df026d3c90f64b5cb878c1921b92e7bf1d5e5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_tharp, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325)
Jan 23 04:51:54 np0005593295 hardcore_tharp[80581]: 167 167
Jan 23 04:51:54 np0005593295 systemd[1]: libpod-5b2080acc631c1cc0b8eb238fc7df026d3c90f64b5cb878c1921b92e7bf1d5e5.scope: Deactivated successfully.
Jan 23 04:51:54 np0005593295 conmon[80581]: conmon 5b2080acc631c1cc0b8e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-5b2080acc631c1cc0b8eb238fc7df026d3c90f64b5cb878c1921b92e7bf1d5e5.scope/container/memory.events
Jan 23 04:51:54 np0005593295 podman[80562]: 2026-01-23 09:51:54.370951236 +0000 UTC m=+0.133879594 container attach 5b2080acc631c1cc0b8eb238fc7df026d3c90f64b5cb878c1921b92e7bf1d5e5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_tharp, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Jan 23 04:51:54 np0005593295 podman[80562]: 2026-01-23 09:51:54.371311956 +0000 UTC m=+0.134240314 container died 5b2080acc631c1cc0b8eb238fc7df026d3c90f64b5cb878c1921b92e7bf1d5e5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_tharp, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_REF=squid)
Jan 23 04:51:54 np0005593295 systemd[1]: var-lib-containers-storage-overlay-65f4ac93920b8b2fcc5dce3a3607ee4121ba998292d3a5b2d6e7e331d3bc65a0-merged.mount: Deactivated successfully.
Jan 23 04:51:54 np0005593295 podman[80562]: 2026-01-23 09:51:54.439205855 +0000 UTC m=+0.202134213 container remove 5b2080acc631c1cc0b8eb238fc7df026d3c90f64b5cb878c1921b92e7bf1d5e5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hardcore_tharp, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:51:54 np0005593295 systemd[1]: libpod-conmon-5b2080acc631c1cc0b8eb238fc7df026d3c90f64b5cb878c1921b92e7bf1d5e5.scope: Deactivated successfully.
Jan 23 04:51:54 np0005593295 podman[80605]: 2026-01-23 09:51:54.614828745 +0000 UTC m=+0.048443629 container create 1e34501b689359b4a380b772c1fce82799193b754b5197e4ae0a7d2382179b1e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_clarke, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Jan 23 04:51:54 np0005593295 systemd[1]: Started libpod-conmon-1e34501b689359b4a380b772c1fce82799193b754b5197e4ae0a7d2382179b1e.scope.
Jan 23 04:51:54 np0005593295 podman[80605]: 2026-01-23 09:51:54.595150476 +0000 UTC m=+0.028765370 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:51:54 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:51:54 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1ca2ea7a5b2d8517e78735b6ceaa0e6e58f570e813fa6ea07f63f6dbb9d27b5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:54 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1ca2ea7a5b2d8517e78735b6ceaa0e6e58f570e813fa6ea07f63f6dbb9d27b5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:54 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1ca2ea7a5b2d8517e78735b6ceaa0e6e58f570e813fa6ea07f63f6dbb9d27b5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:54 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1ca2ea7a5b2d8517e78735b6ceaa0e6e58f570e813fa6ea07f63f6dbb9d27b5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:54 np0005593295 podman[80605]: 2026-01-23 09:51:54.735228831 +0000 UTC m=+0.168843725 container init 1e34501b689359b4a380b772c1fce82799193b754b5197e4ae0a7d2382179b1e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_clarke, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:51:54 np0005593295 podman[80605]: 2026-01-23 09:51:54.744461395 +0000 UTC m=+0.178076269 container start 1e34501b689359b4a380b772c1fce82799193b754b5197e4ae0a7d2382179b1e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_clarke, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default)
Jan 23 04:51:54 np0005593295 podman[80605]: 2026-01-23 09:51:54.748988635 +0000 UTC m=+0.182603509 container attach 1e34501b689359b4a380b772c1fce82799193b754b5197e4ae0a7d2382179b1e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_clarke, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:51:55 np0005593295 great_clarke[80621]: {
Jan 23 04:51:55 np0005593295 great_clarke[80621]:    "2": [
Jan 23 04:51:55 np0005593295 great_clarke[80621]:        {
Jan 23 04:51:55 np0005593295 great_clarke[80621]:            "devices": [
Jan 23 04:51:55 np0005593295 great_clarke[80621]:                "/dev/loop3"
Jan 23 04:51:55 np0005593295 great_clarke[80621]:            ],
Jan 23 04:51:55 np0005593295 great_clarke[80621]:            "lv_name": "ceph_lv0",
Jan 23 04:51:55 np0005593295 great_clarke[80621]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Jan 23 04:51:55 np0005593295 great_clarke[80621]:            "lv_size": "21470642176",
Jan 23 04:51:55 np0005593295 great_clarke[80621]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=2wFOwd-HcwO-2lSY-8RBi-SMwa-NPkg-tiq3o8,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=f3005f84-239a-55b6-a948-8f1fb592b920,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=2edb8fa1-89ea-44cd-9b6e-9f4d89095397,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Jan 23 04:51:55 np0005593295 great_clarke[80621]:            "lv_uuid": "2wFOwd-HcwO-2lSY-8RBi-SMwa-NPkg-tiq3o8",
Jan 23 04:51:55 np0005593295 great_clarke[80621]:            "name": "ceph_lv0",
Jan 23 04:51:55 np0005593295 great_clarke[80621]:            "path": "/dev/ceph_vg0/ceph_lv0",
Jan 23 04:51:55 np0005593295 great_clarke[80621]:            "tags": {
Jan 23 04:51:55 np0005593295 great_clarke[80621]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Jan 23 04:51:55 np0005593295 great_clarke[80621]:                "ceph.block_uuid": "2wFOwd-HcwO-2lSY-8RBi-SMwa-NPkg-tiq3o8",
Jan 23 04:51:55 np0005593295 great_clarke[80621]:                "ceph.cephx_lockbox_secret": "",
Jan 23 04:51:55 np0005593295 great_clarke[80621]:                "ceph.cluster_fsid": "f3005f84-239a-55b6-a948-8f1fb592b920",
Jan 23 04:51:55 np0005593295 great_clarke[80621]:                "ceph.cluster_name": "ceph",
Jan 23 04:51:55 np0005593295 great_clarke[80621]:                "ceph.crush_device_class": "",
Jan 23 04:51:55 np0005593295 great_clarke[80621]:                "ceph.encrypted": "0",
Jan 23 04:51:55 np0005593295 great_clarke[80621]:                "ceph.osd_fsid": "2edb8fa1-89ea-44cd-9b6e-9f4d89095397",
Jan 23 04:51:55 np0005593295 great_clarke[80621]:                "ceph.osd_id": "2",
Jan 23 04:51:55 np0005593295 great_clarke[80621]:                "ceph.osdspec_affinity": "default_drive_group",
Jan 23 04:51:55 np0005593295 great_clarke[80621]:                "ceph.type": "block",
Jan 23 04:51:55 np0005593295 great_clarke[80621]:                "ceph.vdo": "0",
Jan 23 04:51:55 np0005593295 great_clarke[80621]:                "ceph.with_tpm": "0"
Jan 23 04:51:55 np0005593295 great_clarke[80621]:            },
Jan 23 04:51:55 np0005593295 great_clarke[80621]:            "type": "block",
Jan 23 04:51:55 np0005593295 great_clarke[80621]:            "vg_name": "ceph_vg0"
Jan 23 04:51:55 np0005593295 great_clarke[80621]:        }
Jan 23 04:51:55 np0005593295 great_clarke[80621]:    ]
Jan 23 04:51:55 np0005593295 great_clarke[80621]: }
Jan 23 04:51:55 np0005593295 systemd[1]: libpod-1e34501b689359b4a380b772c1fce82799193b754b5197e4ae0a7d2382179b1e.scope: Deactivated successfully.
Jan 23 04:51:55 np0005593295 podman[80605]: 2026-01-23 09:51:55.110526391 +0000 UTC m=+0.544141275 container died 1e34501b689359b4a380b772c1fce82799193b754b5197e4ae0a7d2382179b1e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_clarke, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 04:51:55 np0005593295 systemd[1]: var-lib-containers-storage-overlay-e1ca2ea7a5b2d8517e78735b6ceaa0e6e58f570e813fa6ea07f63f6dbb9d27b5-merged.mount: Deactivated successfully.
Jan 23 04:51:55 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:51:55 np0005593295 podman[80605]: 2026-01-23 09:51:55.165987978 +0000 UTC m=+0.599602852 container remove 1e34501b689359b4a380b772c1fce82799193b754b5197e4ae0a7d2382179b1e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_clarke, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Jan 23 04:51:55 np0005593295 systemd[1]: libpod-conmon-1e34501b689359b4a380b772c1fce82799193b754b5197e4ae0a7d2382179b1e.scope: Deactivated successfully.
Jan 23 04:51:55 np0005593295 podman[80734]: 2026-01-23 09:51:55.780695695 +0000 UTC m=+0.045407369 container create 7cabe110c3b957c02879fe36945f105f1bd31899cf8ff83590326e646c8b5266 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_sanderson, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:51:55 np0005593295 systemd[1]: Started libpod-conmon-7cabe110c3b957c02879fe36945f105f1bd31899cf8ff83590326e646c8b5266.scope.
Jan 23 04:51:55 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:51:55 np0005593295 podman[80734]: 2026-01-23 09:51:55.755824424 +0000 UTC m=+0.020536128 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:51:55 np0005593295 podman[80734]: 2026-01-23 09:51:55.863493262 +0000 UTC m=+0.128204966 container init 7cabe110c3b957c02879fe36945f105f1bd31899cf8ff83590326e646c8b5266 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_sanderson, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:51:55 np0005593295 podman[80734]: 2026-01-23 09:51:55.871696257 +0000 UTC m=+0.136407941 container start 7cabe110c3b957c02879fe36945f105f1bd31899cf8ff83590326e646c8b5266 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_sanderson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, ceph=True, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:51:55 np0005593295 podman[80734]: 2026-01-23 09:51:55.875425986 +0000 UTC m=+0.140137690 container attach 7cabe110c3b957c02879fe36945f105f1bd31899cf8ff83590326e646c8b5266 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_sanderson, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Jan 23 04:51:55 np0005593295 brave_sanderson[80750]: 167 167
Jan 23 04:51:55 np0005593295 systemd[1]: libpod-7cabe110c3b957c02879fe36945f105f1bd31899cf8ff83590326e646c8b5266.scope: Deactivated successfully.
Jan 23 04:51:55 np0005593295 podman[80734]: 2026-01-23 09:51:55.879628226 +0000 UTC m=+0.144339900 container died 7cabe110c3b957c02879fe36945f105f1bd31899cf8ff83590326e646c8b5266 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_sanderson, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Jan 23 04:51:55 np0005593295 systemd[1]: var-lib-containers-storage-overlay-3edd05407961d56181044db07bf2b7fd31c013f92930dfe7345aa8510c633b3e-merged.mount: Deactivated successfully.
Jan 23 04:51:56 np0005593295 podman[80734]: 2026-01-23 09:51:56.01823654 +0000 UTC m=+0.282948214 container remove 7cabe110c3b957c02879fe36945f105f1bd31899cf8ff83590326e646c8b5266 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_sanderson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:51:56 np0005593295 systemd[1]: libpod-conmon-7cabe110c3b957c02879fe36945f105f1bd31899cf8ff83590326e646c8b5266.scope: Deactivated successfully.
Jan 23 04:51:56 np0005593295 podman[80781]: 2026-01-23 09:51:56.270103685 +0000 UTC m=+0.043677759 container create cfa5b3a306552e3a46183377894b08efb034c3ad936be745e252328e9a254514 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate-test, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid)
Jan 23 04:51:56 np0005593295 systemd[1]: Started libpod-conmon-cfa5b3a306552e3a46183377894b08efb034c3ad936be745e252328e9a254514.scope.
Jan 23 04:51:56 np0005593295 podman[80781]: 2026-01-23 09:51:56.253081691 +0000 UTC m=+0.026655765 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:51:56 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:51:56 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70b3598d9336858d8d429b6f0ae6a9d341eb1d40de1759b181ae9a8224d1bf90/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:56 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70b3598d9336858d8d429b6f0ae6a9d341eb1d40de1759b181ae9a8224d1bf90/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:56 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70b3598d9336858d8d429b6f0ae6a9d341eb1d40de1759b181ae9a8224d1bf90/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:56 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70b3598d9336858d8d429b6f0ae6a9d341eb1d40de1759b181ae9a8224d1bf90/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:56 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70b3598d9336858d8d429b6f0ae6a9d341eb1d40de1759b181ae9a8224d1bf90/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:56 np0005593295 podman[80781]: 2026-01-23 09:51:56.385061997 +0000 UTC m=+0.158636101 container init cfa5b3a306552e3a46183377894b08efb034c3ad936be745e252328e9a254514 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate-test, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Jan 23 04:51:56 np0005593295 podman[80781]: 2026-01-23 09:51:56.39237176 +0000 UTC m=+0.165945834 container start cfa5b3a306552e3a46183377894b08efb034c3ad936be745e252328e9a254514 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate-test, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:51:56 np0005593295 podman[80781]: 2026-01-23 09:51:56.413743228 +0000 UTC m=+0.187317302 container attach cfa5b3a306552e3a46183377894b08efb034c3ad936be745e252328e9a254514 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate-test, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:51:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate-test[80797]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_FSID]
Jan 23 04:51:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate-test[80797]:                            [--no-systemd] [--no-tmpfs]
Jan 23 04:51:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate-test[80797]: ceph-volume activate: error: unrecognized arguments: --bad-option
Jan 23 04:51:56 np0005593295 systemd[1]: libpod-cfa5b3a306552e3a46183377894b08efb034c3ad936be745e252328e9a254514.scope: Deactivated successfully.
Jan 23 04:51:56 np0005593295 podman[80781]: 2026-01-23 09:51:56.575535203 +0000 UTC m=+0.349109287 container died cfa5b3a306552e3a46183377894b08efb034c3ad936be745e252328e9a254514 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate-test, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:51:56 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Jan 23 04:51:56 np0005593295 ceph-mon[75771]: Deploying daemon osd.2 on compute-2
Jan 23 04:51:56 np0005593295 systemd[1]: var-lib-containers-storage-overlay-70b3598d9336858d8d429b6f0ae6a9d341eb1d40de1759b181ae9a8224d1bf90-merged.mount: Deactivated successfully.
Jan 23 04:51:56 np0005593295 podman[80781]: 2026-01-23 09:51:56.615505342 +0000 UTC m=+0.389079426 container remove cfa5b3a306552e3a46183377894b08efb034c3ad936be745e252328e9a254514 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate-test, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Jan 23 04:51:56 np0005593295 systemd[1]: libpod-conmon-cfa5b3a306552e3a46183377894b08efb034c3ad936be745e252328e9a254514.scope: Deactivated successfully.
Jan 23 04:51:56 np0005593295 systemd[1]: Reloading.
Jan 23 04:51:56 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:51:56 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:51:57 np0005593295 systemd[1]: Reloading.
Jan 23 04:51:57 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:51:57 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:51:57 np0005593295 systemd[1]: Starting Ceph osd.2 for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:51:57 np0005593295 podman[80955]: 2026-01-23 09:51:57.706816975 +0000 UTC m=+0.040776050 container create cfdc7362c938cf1fdf58c6f7b2188cd40a747c025b0ba29f9707a03cde81f642 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:51:57 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:51:57 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee275f3653f7db06acf72ad0d0779d48fb272855acbf54673ba74ab31a020a8e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:57 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee275f3653f7db06acf72ad0d0779d48fb272855acbf54673ba74ab31a020a8e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:57 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee275f3653f7db06acf72ad0d0779d48fb272855acbf54673ba74ab31a020a8e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:57 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee275f3653f7db06acf72ad0d0779d48fb272855acbf54673ba74ab31a020a8e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:57 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee275f3653f7db06acf72ad0d0779d48fb272855acbf54673ba74ab31a020a8e/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:57 np0005593295 podman[80955]: 2026-01-23 09:51:57.687421583 +0000 UTC m=+0.021380698 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:51:57 np0005593295 podman[80955]: 2026-01-23 09:51:57.800117862 +0000 UTC m=+0.134076947 container init cfdc7362c938cf1fdf58c6f7b2188cd40a747c025b0ba29f9707a03cde81f642 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Jan 23 04:51:57 np0005593295 podman[80955]: 2026-01-23 09:51:57.806226447 +0000 UTC m=+0.140185512 container start cfdc7362c938cf1fdf58c6f7b2188cd40a747c025b0ba29f9707a03cde81f642 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 04:51:57 np0005593295 podman[80955]: 2026-01-23 09:51:57.809676018 +0000 UTC m=+0.143635123 container attach cfdc7362c938cf1fdf58c6f7b2188cd40a747c025b0ba29f9707a03cde81f642 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Jan 23 04:51:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:51:57 np0005593295 bash[80955]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:51:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:51:58 np0005593295 bash[80955]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:51:58 np0005593295 lvm[81051]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 04:51:58 np0005593295 lvm[81051]: VG ceph_vg0 finished
Jan 23 04:51:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: --> Failed to activate via raw: did not find any matching OSD to activate
Jan 23 04:51:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:51:58 np0005593295 bash[80955]: --> Failed to activate via raw: did not find any matching OSD to activate
Jan 23 04:51:58 np0005593295 bash[80955]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:51:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:51:58 np0005593295 bash[80955]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:51:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 23 04:51:58 np0005593295 bash[80955]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 23 04:51:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Jan 23 04:51:58 np0005593295 bash[80955]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Jan 23 04:51:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 23 04:51:59 np0005593295 bash[80955]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 23 04:51:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Jan 23 04:51:59 np0005593295 bash[80955]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Jan 23 04:51:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 04:51:59 np0005593295 bash[80955]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 04:51:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 23 04:51:59 np0005593295 bash[80955]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 23 04:51:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate[80969]: --> ceph-volume lvm activate successful for osd ID: 2
Jan 23 04:51:59 np0005593295 bash[80955]: --> ceph-volume lvm activate successful for osd ID: 2
Jan 23 04:51:59 np0005593295 systemd[1]: libpod-cfdc7362c938cf1fdf58c6f7b2188cd40a747c025b0ba29f9707a03cde81f642.scope: Deactivated successfully.
Jan 23 04:51:59 np0005593295 podman[80955]: 2026-01-23 09:51:59.085489255 +0000 UTC m=+1.419448330 container died cfdc7362c938cf1fdf58c6f7b2188cd40a747c025b0ba29f9707a03cde81f642 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:51:59 np0005593295 systemd[1]: libpod-cfdc7362c938cf1fdf58c6f7b2188cd40a747c025b0ba29f9707a03cde81f642.scope: Consumed 1.359s CPU time.
Jan 23 04:51:59 np0005593295 systemd[1]: var-lib-containers-storage-overlay-ee275f3653f7db06acf72ad0d0779d48fb272855acbf54673ba74ab31a020a8e-merged.mount: Deactivated successfully.
Jan 23 04:51:59 np0005593295 podman[80955]: 2026-01-23 09:51:59.140363679 +0000 UTC m=+1.474322754 container remove cfdc7362c938cf1fdf58c6f7b2188cd40a747c025b0ba29f9707a03cde81f642 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2-activate, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:51:59 np0005593295 podman[81211]: 2026-01-23 09:51:59.352491569 +0000 UTC m=+0.046417854 container create 2fe854da164488d1d6d6dc2938b8a7e9e3832f2f3f02511f769527c4b75bc72f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:51:59 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36d9800ec1622e14d7b1636fbece62a9ba7a23105468dbe37503c04383eeaa8d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:59 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36d9800ec1622e14d7b1636fbece62a9ba7a23105468dbe37503c04383eeaa8d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:59 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36d9800ec1622e14d7b1636fbece62a9ba7a23105468dbe37503c04383eeaa8d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:59 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36d9800ec1622e14d7b1636fbece62a9ba7a23105468dbe37503c04383eeaa8d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:59 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36d9800ec1622e14d7b1636fbece62a9ba7a23105468dbe37503c04383eeaa8d/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:59 np0005593295 podman[81211]: 2026-01-23 09:51:59.40635481 +0000 UTC m=+0.100281095 container init 2fe854da164488d1d6d6dc2938b8a7e9e3832f2f3f02511f769527c4b75bc72f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid)
Jan 23 04:51:59 np0005593295 podman[81211]: 2026-01-23 09:51:59.41521809 +0000 UTC m=+0.109144375 container start 2fe854da164488d1d6d6dc2938b8a7e9e3832f2f3f02511f769527c4b75bc72f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Jan 23 04:51:59 np0005593295 bash[81211]: 2fe854da164488d1d6d6dc2938b8a7e9e3832f2f3f02511f769527c4b75bc72f
Jan 23 04:51:59 np0005593295 podman[81211]: 2026-01-23 09:51:59.331280415 +0000 UTC m=+0.025206700 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:51:59 np0005593295 systemd[1]: Started Ceph osd.2 for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-osd, pid 2
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: pidfile_write: ignore empty --pid-file
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:51:59 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559222c1b800 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559222c1b800 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559222c1b800 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559222c1b800 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559222c1b800 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 04:52:00 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559222c1bc00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: load: jerasure load: lrc 
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 04:52:00 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a92c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a93000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a93000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a93000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a93000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluefs mount
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluefs mount shared_bdev_used = 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: RocksDB version: 7.9.2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Git sha 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Compile date 2025-07-17 03:12:14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: DB SUMMARY
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: DB Session ID:  BA6EM20ZQ6TLD6WYVFWO
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: CURRENT file:  CURRENT
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: IDENTITY file:  IDENTITY
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                         Options.error_if_exists: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.create_if_missing: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                         Options.paranoid_checks: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                                     Options.env: 0x559222c6e770
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                                Options.info_log: 0x559223a979e0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_file_opening_threads: 16
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                              Options.statistics: (nil)
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                               Options.use_fsync: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.max_log_file_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                         Options.allow_fallocate: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.use_direct_reads: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.create_missing_column_families: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                              Options.db_log_dir: 
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                                 Options.wal_dir: db.wal
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.advise_random_on_open: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.write_buffer_manager: 0x559223b82a00
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                            Options.rate_limiter: (nil)
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.unordered_write: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                               Options.row_cache: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                              Options.wal_filter: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.allow_ingest_behind: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.two_write_queues: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.manual_wal_flush: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.wal_compression: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.atomic_flush: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.log_readahead_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.allow_data_in_errors: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.db_host_id: __hostname__
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.max_background_jobs: 4
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.max_background_compactions: -1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.max_subcompactions: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.max_open_files: -1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.bytes_per_sync: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.max_background_flushes: -1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Compression algorithms supported:
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: 	kZSTD supported: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: 	kXpressCompression supported: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: 	kBZip2Compression supported: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: 	kLZ4Compression supported: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: 	kZlibCompression supported: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: 	kSnappyCompression supported: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97da0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x559222cb1350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97da0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x559222cb1350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97da0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x559222cb1350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97da0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x559222cb1350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97da0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x559222cb1350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97da0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x559222cb1350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97da0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x559222cb1350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97dc0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x559222cb09b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97dc0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x559222cb09b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97dc0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x559222cb09b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: b695fe5f-810b-4432-8e0d-0ba463e0cde8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161921737516, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161921737828, "job": 1, "event": "recovery_finished"}
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: freelist init
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: freelist _read_cfg
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluefs umount
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a93000 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a93000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a93000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a93000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bdev(0x559223a93000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluefs mount
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluefs mount shared_bdev_used = 4718592
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: RocksDB version: 7.9.2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Git sha 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Compile date 2025-07-17 03:12:14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: DB SUMMARY
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: DB Session ID:  BA6EM20ZQ6TLD6WYVFWP
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: CURRENT file:  CURRENT
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: IDENTITY file:  IDENTITY
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                         Options.error_if_exists: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.create_if_missing: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                         Options.paranoid_checks: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                                     Options.env: 0x559222c6ed90
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                                Options.info_log: 0x559223c42760
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_file_opening_threads: 16
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                              Options.statistics: (nil)
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                               Options.use_fsync: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.max_log_file_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                         Options.allow_fallocate: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.use_direct_reads: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.create_missing_column_families: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                              Options.db_log_dir: 
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                                 Options.wal_dir: db.wal
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.advise_random_on_open: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.write_buffer_manager: 0x559223b82a00
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                            Options.rate_limiter: (nil)
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.unordered_write: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                               Options.row_cache: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                              Options.wal_filter: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.allow_ingest_behind: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.two_write_queues: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.manual_wal_flush: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.wal_compression: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.atomic_flush: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.log_readahead_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.allow_data_in_errors: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.db_host_id: __hostname__
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.max_background_jobs: 4
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.max_background_compactions: -1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.max_subcompactions: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.max_open_files: -1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.bytes_per_sync: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.max_background_flushes: -1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Compression algorithms supported:
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: 	kZSTD supported: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: 	kXpressCompression supported: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: 	kBZip2Compression supported: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: 	kLZ4Compression supported: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: 	kZlibCompression supported: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: 	kSnappyCompression supported: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a978c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x559222cb1350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a978c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x559222cb1350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:52:01 np0005593295 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a978c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x559222cb1350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a978c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x559222cb1350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a978c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x559222cb1350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a978c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x559222cb1350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a978c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x559222cb1350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97d00)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x559222cb09b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97d00)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x559222cb09b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:           Options.merge_operator: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559223a97d00)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x559222cb09b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.compression: LZ4
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:             Options.num_levels: 7
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: b695fe5f-810b-4432-8e0d-0ba463e0cde8
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161922007715, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161922012650, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161922, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "b695fe5f-810b-4432-8e0d-0ba463e0cde8", "db_session_id": "BA6EM20ZQ6TLD6WYVFWP", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161922016477, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1595, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 469, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 571, "raw_average_value_size": 285, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161922, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "b695fe5f-810b-4432-8e0d-0ba463e0cde8", "db_session_id": "BA6EM20ZQ6TLD6WYVFWP", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161922021452, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161922, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "b695fe5f-810b-4432-8e0d-0ba463e0cde8", "db_session_id": "BA6EM20ZQ6TLD6WYVFWP", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161922024567, "job": 1, "event": "recovery_finished"}
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x559223dee000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: DB pointer 0x559223dd0000
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 
collections: 1 last_copies: 8 last_secs: 2.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 
0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 460.80 MB usag
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/hello/cls_hello.cc:316: loading cls_hello
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: _get_class not permitted to load lua
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: _get_class not permitted to load sdk
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: osd.2 0 load_pgs
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: osd.2 0 load_pgs opened 0 pgs
Jan 23 04:52:02 np0005593295 ceph-osd[81231]: osd.2 0 log_to_monitors true
Jan 23 04:52:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2[81227]: 2026-01-23T09:52:02.056+0000 7f0b54478740 -1 osd.2 0 log_to_monitors true
Jan 23 04:52:02 np0005593295 podman[81740]: 2026-01-23 09:52:02.067796802 +0000 UTC m=+0.043122186 container create dff58781b88df0d396af51ead4fcda5420dcb9b234607b1ddd2c4800e6217e69 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_meninsky, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.40.1, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 04:52:02 np0005593295 podman[81740]: 2026-01-23 09:52:02.046369002 +0000 UTC m=+0.021694416 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:52:02 np0005593295 systemd[1]: Started libpod-conmon-dff58781b88df0d396af51ead4fcda5420dcb9b234607b1ddd2c4800e6217e69.scope.
Jan 23 04:52:02 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:52:02 np0005593295 podman[81740]: 2026-01-23 09:52:02.359593315 +0000 UTC m=+0.334918729 container init dff58781b88df0d396af51ead4fcda5420dcb9b234607b1ddd2c4800e6217e69 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_meninsky, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:52:02 np0005593295 podman[81740]: 2026-01-23 09:52:02.367057742 +0000 UTC m=+0.342383126 container start dff58781b88df0d396af51ead4fcda5420dcb9b234607b1ddd2c4800e6217e69 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_meninsky, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Jan 23 04:52:02 np0005593295 podman[81740]: 2026-01-23 09:52:02.371463967 +0000 UTC m=+0.346789371 container attach dff58781b88df0d396af51ead4fcda5420dcb9b234607b1ddd2c4800e6217e69 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_meninsky, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:52:02 np0005593295 jolly_meninsky[81789]: 167 167
Jan 23 04:52:02 np0005593295 systemd[1]: libpod-dff58781b88df0d396af51ead4fcda5420dcb9b234607b1ddd2c4800e6217e69.scope: Deactivated successfully.
Jan 23 04:52:02 np0005593295 podman[81740]: 2026-01-23 09:52:02.372615244 +0000 UTC m=+0.347940628 container died dff58781b88df0d396af51ead4fcda5420dcb9b234607b1ddd2c4800e6217e69 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_meninsky, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:52:02 np0005593295 systemd[1]: var-lib-containers-storage-overlay-250279687aeb2cd740afdfbb55d22bf5e55e159c702a5771f0b6df12a1c879b1-merged.mount: Deactivated successfully.
Jan 23 04:52:02 np0005593295 podman[81740]: 2026-01-23 09:52:02.442777651 +0000 UTC m=+0.418103035 container remove dff58781b88df0d396af51ead4fcda5420dcb9b234607b1ddd2c4800e6217e69 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_meninsky, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Jan 23 04:52:02 np0005593295 systemd[1]: libpod-conmon-dff58781b88df0d396af51ead4fcda5420dcb9b234607b1ddd2c4800e6217e69.scope: Deactivated successfully.
Jan 23 04:52:02 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:02 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:02 np0005593295 podman[81815]: 2026-01-23 09:52:02.592042509 +0000 UTC m=+0.039637033 container create 5f9c2070c21515a857c445d3c2108add9bc71d52845d31b03f8a20035765ce1a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_booth, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Jan 23 04:52:02 np0005593295 systemd[1]: Started libpod-conmon-5f9c2070c21515a857c445d3c2108add9bc71d52845d31b03f8a20035765ce1a.scope.
Jan 23 04:52:02 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:52:02 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd736dbfd702450691a7eef8bb914199da96d67d4d8df264e07f53afa050b113/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:02 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd736dbfd702450691a7eef8bb914199da96d67d4d8df264e07f53afa050b113/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:02 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd736dbfd702450691a7eef8bb914199da96d67d4d8df264e07f53afa050b113/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:02 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd736dbfd702450691a7eef8bb914199da96d67d4d8df264e07f53afa050b113/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:02 np0005593295 podman[81815]: 2026-01-23 09:52:02.574157163 +0000 UTC m=+0.021751607 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:52:02 np0005593295 podman[81815]: 2026-01-23 09:52:02.671595399 +0000 UTC m=+0.119189843 container init 5f9c2070c21515a857c445d3c2108add9bc71d52845d31b03f8a20035765ce1a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_booth, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid)
Jan 23 04:52:02 np0005593295 podman[81815]: 2026-01-23 09:52:02.684329002 +0000 UTC m=+0.131923426 container start 5f9c2070c21515a857c445d3c2108add9bc71d52845d31b03f8a20035765ce1a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_booth, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:52:02 np0005593295 podman[81815]: 2026-01-23 09:52:02.730196332 +0000 UTC m=+0.177790776 container attach 5f9c2070c21515a857c445d3c2108add9bc71d52845d31b03f8a20035765ce1a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_booth, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.build-date=20250325)
Jan 23 04:52:02 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]} v 0)
Jan 23 04:52:02 np0005593295 ceph-mon[75771]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/1020282776,v1:192.168.122.102:6801/1020282776]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 23 04:52:03 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Jan 23 04:52:03 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Jan 23 04:52:03 np0005593295 lvm[81904]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 04:52:03 np0005593295 lvm[81904]: VG ceph_vg0 finished
Jan 23 04:52:03 np0005593295 naughty_booth[81831]: {}
Jan 23 04:52:03 np0005593295 systemd[1]: libpod-5f9c2070c21515a857c445d3c2108add9bc71d52845d31b03f8a20035765ce1a.scope: Deactivated successfully.
Jan 23 04:52:03 np0005593295 systemd[1]: libpod-5f9c2070c21515a857c445d3c2108add9bc71d52845d31b03f8a20035765ce1a.scope: Consumed 1.107s CPU time.
Jan 23 04:52:03 np0005593295 podman[81815]: 2026-01-23 09:52:03.411798667 +0000 UTC m=+0.859393111 container died 5f9c2070c21515a857c445d3c2108add9bc71d52845d31b03f8a20035765ce1a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_booth, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Jan 23 04:52:03 np0005593295 systemd[1]: var-lib-containers-storage-overlay-cd736dbfd702450691a7eef8bb914199da96d67d4d8df264e07f53afa050b113-merged.mount: Deactivated successfully.
Jan 23 04:52:03 np0005593295 podman[81815]: 2026-01-23 09:52:03.54484295 +0000 UTC m=+0.992437374 container remove 5f9c2070c21515a857c445d3c2108add9bc71d52845d31b03f8a20035765ce1a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_booth, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:52:03 np0005593295 systemd[1]: libpod-conmon-5f9c2070c21515a857c445d3c2108add9bc71d52845d31b03f8a20035765ce1a.scope: Deactivated successfully.
Jan 23 04:52:03 np0005593295 ceph-mon[75771]: from='osd.2 [v2:192.168.122.102:6800/1020282776,v1:192.168.122.102:6801/1020282776]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 23 04:52:03 np0005593295 ceph-mon[75771]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 23 04:52:04 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e40 e40: 3 total, 2 up, 3 in
Jan 23 04:52:04 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]} v 0)
Jan 23 04:52:04 np0005593295 ceph-mon[75771]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/1020282776,v1:192.168.122.102:6801/1020282776]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 23 04:52:05 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:07 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e41 e41: 3 total, 2 up, 3 in
Jan 23 04:52:07 np0005593295 ceph-mon[75771]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Jan 23 04:52:07 np0005593295 ceph-mon[75771]: from='osd.2 [v2:192.168.122.102:6800/1020282776,v1:192.168.122.102:6801/1020282776]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 23 04:52:07 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:07 np0005593295 ceph-mon[75771]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 23 04:52:07 np0005593295 ceph-osd[81231]: osd.2 0 done with init, starting boot process
Jan 23 04:52:07 np0005593295 ceph-osd[81231]: osd.2 0 start_boot
Jan 23 04:52:07 np0005593295 ceph-osd[81231]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Jan 23 04:52:07 np0005593295 ceph-osd[81231]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Jan 23 04:52:07 np0005593295 ceph-osd[81231]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Jan 23 04:52:07 np0005593295 ceph-osd[81231]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Jan 23 04:52:07 np0005593295 ceph-osd[81231]: osd.2 0  bench count 12288000 bsize 4 KiB
Jan 23 04:52:07 np0005593295 podman[82009]: 2026-01-23 09:52:07.854793444 +0000 UTC m=+0.039866969 container create bccb3c5c85df11e48a5e9073a3030819a1f765eb1201f05c97a7895048a44586 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_maxwell, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:52:07 np0005593295 systemd[1]: Started libpod-conmon-bccb3c5c85df11e48a5e9073a3030819a1f765eb1201f05c97a7895048a44586.scope.
Jan 23 04:52:07 np0005593295 podman[82009]: 2026-01-23 09:52:07.837273087 +0000 UTC m=+0.022346642 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:52:07 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:52:07 np0005593295 podman[82009]: 2026-01-23 09:52:07.989225338 +0000 UTC m=+0.174298863 container init bccb3c5c85df11e48a5e9073a3030819a1f765eb1201f05c97a7895048a44586 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_maxwell, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid, io.buildah.version=1.40.1)
Jan 23 04:52:07 np0005593295 podman[82009]: 2026-01-23 09:52:07.996802178 +0000 UTC m=+0.181875703 container start bccb3c5c85df11e48a5e9073a3030819a1f765eb1201f05c97a7895048a44586 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_maxwell, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Jan 23 04:52:08 np0005593295 great_maxwell[82024]: 167 167
Jan 23 04:52:08 np0005593295 systemd[1]: libpod-bccb3c5c85df11e48a5e9073a3030819a1f765eb1201f05c97a7895048a44586.scope: Deactivated successfully.
Jan 23 04:52:08 np0005593295 podman[82009]: 2026-01-23 09:52:08.017791517 +0000 UTC m=+0.202865042 container attach bccb3c5c85df11e48a5e9073a3030819a1f765eb1201f05c97a7895048a44586 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_maxwell, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:52:08 np0005593295 podman[82009]: 2026-01-23 09:52:08.01918445 +0000 UTC m=+0.204257975 container died bccb3c5c85df11e48a5e9073a3030819a1f765eb1201f05c97a7895048a44586 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_maxwell, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1)
Jan 23 04:52:08 np0005593295 systemd[1]: var-lib-containers-storage-overlay-78fa74605695d3d171647444fa256abe097174f2b854c46c3f6d5605141fb40f-merged.mount: Deactivated successfully.
Jan 23 04:52:08 np0005593295 podman[82009]: 2026-01-23 09:52:08.172238166 +0000 UTC m=+0.357311691 container remove bccb3c5c85df11e48a5e9073a3030819a1f765eb1201f05c97a7895048a44586 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_maxwell, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 23 04:52:08 np0005593295 systemd[1]: libpod-conmon-bccb3c5c85df11e48a5e9073a3030819a1f765eb1201f05c97a7895048a44586.scope: Deactivated successfully.
Jan 23 04:52:08 np0005593295 systemd[1]: Reloading.
Jan 23 04:52:08 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:52:08 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:52:08 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:08 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.yzflfx", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 04:52:08 np0005593295 ceph-mon[75771]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]': finished
Jan 23 04:52:08 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.yzflfx", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 04:52:08 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:08 np0005593295 ceph-mon[75771]: Deploying daemon rgw.rgw.compute-2.yzflfx on compute-2
Jan 23 04:52:08 np0005593295 systemd[1]: Reloading.
Jan 23 04:52:08 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:52:08 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:52:09 np0005593295 systemd[1]: Starting Ceph rgw.rgw.compute-2.yzflfx for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:52:10 np0005593295 podman[82166]: 2026-01-23 09:52:10.040330037 +0000 UTC m=+0.067842353 container create acdcd7af422f7f2705bf22df75cd33bfe41cd9d5b73262aeba6616edae8c78b9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-rgw-rgw-compute-2-yzflfx, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:52:10 np0005593295 podman[82166]: 2026-01-23 09:52:09.997316295 +0000 UTC m=+0.024828611 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:52:10 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e41 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:10 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97256027b59122fa8a8bc9321d1cf77dbcaa9b23d28b00f4296d4f441bda5c05/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:10 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97256027b59122fa8a8bc9321d1cf77dbcaa9b23d28b00f4296d4f441bda5c05/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:10 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97256027b59122fa8a8bc9321d1cf77dbcaa9b23d28b00f4296d4f441bda5c05/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:10 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97256027b59122fa8a8bc9321d1cf77dbcaa9b23d28b00f4296d4f441bda5c05/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-2.yzflfx supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:10 np0005593295 podman[82166]: 2026-01-23 09:52:10.178450059 +0000 UTC m=+0.205962405 container init acdcd7af422f7f2705bf22df75cd33bfe41cd9d5b73262aeba6616edae8c78b9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-rgw-rgw-compute-2-yzflfx, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:52:10 np0005593295 podman[82166]: 2026-01-23 09:52:10.188741294 +0000 UTC m=+0.216253610 container start acdcd7af422f7f2705bf22df75cd33bfe41cd9d5b73262aeba6616edae8c78b9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-rgw-rgw-compute-2-yzflfx, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:52:10 np0005593295 bash[82166]: acdcd7af422f7f2705bf22df75cd33bfe41cd9d5b73262aeba6616edae8c78b9
Jan 23 04:52:10 np0005593295 systemd[1]: Started Ceph rgw.rgw.compute-2.yzflfx for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:52:10 np0005593295 radosgw[82185]: deferred set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:52:10 np0005593295 radosgw[82185]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process radosgw, pid 2
Jan 23 04:52:10 np0005593295 radosgw[82185]: framework: beast
Jan 23 04:52:10 np0005593295 radosgw[82185]: framework conf key: endpoint, val: 192.168.122.102:8082
Jan 23 04:52:10 np0005593295 radosgw[82185]: init_numa not setting numa affinity
Jan 23 04:52:11 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:11 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:11 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:11 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.syfcuk", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 04:52:11 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.syfcuk", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 04:52:11 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:15 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e42 e42: 3 total, 2 up, 3 in
Jan 23 04:52:15 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"} v 0)
Jan 23 04:52:15 np0005593295 ceph-mon[75771]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/2692084146' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 23 04:52:15 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:15 np0005593295 ceph-mon[75771]: Deploying daemon rgw.rgw.compute-1.syfcuk on compute-1
Jan 23 04:52:16 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e43 e43: 3 total, 2 up, 3 in
Jan 23 04:52:17 np0005593295 ceph-osd[81231]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 8.114 iops: 2077.197 elapsed_sec: 1.444
Jan 23 04:52:17 np0005593295 ceph-osd[81231]: log_channel(cluster) log [WRN] : OSD bench result of 2077.197482 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 23 04:52:17 np0005593295 ceph-osd[81231]: osd.2 0 waiting for initial osdmap
Jan 23 04:52:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2[81227]: 2026-01-23T09:52:17.437+0000 7f0b50c0e640 -1 osd.2 0 waiting for initial osdmap
Jan 23 04:52:17 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.102:0/2692084146' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 23 04:52:17 np0005593295 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 23 04:52:17 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:17 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:17 np0005593295 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Jan 23 04:52:17 np0005593295 ceph-osd[81231]: osd.2 40 crush map has features 288514051259236352, adjusting msgr requires for clients
Jan 23 04:52:17 np0005593295 ceph-osd[81231]: osd.2 40 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Jan 23 04:52:17 np0005593295 ceph-osd[81231]: osd.2 40 crush map has features 3314933000852226048, adjusting msgr requires for osds
Jan 23 04:52:17 np0005593295 ceph-osd[81231]: osd.2 40 check_osdmap_features require_osd_release unknown -> squid
Jan 23 04:52:17 np0005593295 ceph-osd[81231]: osd.2 43 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 23 04:52:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-2[81227]: 2026-01-23T09:52:17.484+0000 7f0b4ba23640 -1 osd.2 43 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 23 04:52:17 np0005593295 ceph-osd[81231]: osd.2 43 set_numa_affinity not setting numa affinity
Jan 23 04:52:17 np0005593295 ceph-osd[81231]: osd.2 43 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial no unique device path for loop3: no symlink to loop3 in /dev/disk/by-path
Jan 23 04:52:18 np0005593295 ceph-osd[81231]: osd.2 43 tick checking mon for new map
Jan 23 04:52:18 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e44 e44: 3 total, 2 up, 3 in
Jan 23 04:52:19 np0005593295 ceph-mon[75771]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 04:52:19 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:19 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.jbpfwf", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 04:52:19 np0005593295 ceph-mon[75771]: OSD bench result of 2077.197482 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 23 04:52:19 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.jbpfwf", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 04:52:19 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e45 e45: 3 total, 3 up, 3 in
Jan 23 04:52:19 np0005593295 ceph-osd[81231]: osd.2 45 state: booting -> active
Jan 23 04:52:20 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e45 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[4.19( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[6.1b( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[4.1c( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[3.8( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[2.1b( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[3.1b( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[4.6( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[4.2( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[5.0( empty local-lis/les=0/0 n=0 ec=16/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[3.0( empty local-lis/les=0/0 n=0 ec=14/14 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[6.1( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[2.a( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[5.d( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[2.d( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[4.1d( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[2.c( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[7.a( empty local-lis/les=0/0 n=0 ec=27/18 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[5.b( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[5.8( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[2.10( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[7.14( empty local-lis/les=0/0 n=0 ec=27/18 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[4.14( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[2.13( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[2.15( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[5.12( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[5.13( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[7.1d( empty local-lis/les=0/0 n=0 ec=27/18 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:20 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 45 pg[4.3( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:21 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e46 e46: 3 total, 3 up, 3 in
Jan 23 04:52:21 np0005593295 ceph-mon[75771]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 23 04:52:21 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:21 np0005593295 ceph-mon[75771]: Deploying daemon rgw.rgw.compute-0.jbpfwf on compute-0
Jan 23 04:52:21 np0005593295 ceph-mon[75771]: osd.2 [v2:192.168.122.102:6800/1020282776,v1:192.168.122.102:6801/1020282776] boot
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.15( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=45/46 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=45/46 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[7.1d( empty local-lis/les=45/46 n=0 ec=27/18 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.14( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.b( empty local-lis/les=45/46 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.10( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[7.14( empty local-lis/les=45/46 n=0 ec=27/18 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.8( empty local-lis/les=45/46 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.c( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.d( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=45/46 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.0( empty local-lis/les=45/46 n=0 ec=14/14 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.19( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[6.1b( empty local-lis/les=45/46 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.1b( empty local-lis/les=45/46 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[7.a( empty local-lis/les=45/46 n=0 ec=27/18 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.6( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.3( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.1b( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.2( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[6.1( empty local-lis/les=45/46 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.0( empty local-lis/les=45/46 n=0 ec=16/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.d( empty local-lis/les=45/46 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=45) [2] r=0 lpr=45 pi=[26,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.1d( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0)
Jan 23 04:52:22 np0005593295 ceph-mon[75771]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 04:52:22 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 04:52:22 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 04:52:22 np0005593295 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 04:52:22 np0005593295 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[7.1f( empty local-lis/les=0/0 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=0 lpr=46 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.1f( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.15( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[6.1e( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[6.12( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[7.16( empty local-lis/les=0/0 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=0 lpr=46 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.12( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.15( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=0/0 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=0 lpr=46 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.8( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=0 lpr=46 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.9( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.1a( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.e( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[7.1f( empty local-lis/les=45/46 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=0 lpr=46 pi=[27,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[6.1c( empty local-lis/les=45/46 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.1f( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[6.1e( empty local-lis/les=45/46 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.18( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[6.12( empty local-lis/les=45/46 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[6.17( empty local-lis/les=45/46 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.15( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[7.16( empty local-lis/les=45/46 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=0 lpr=46 pi=[27,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[7.11( empty local-lis/les=45/46 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=0 lpr=46 pi=[27,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.9( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=45/46 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.12( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.8( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=45/46 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.b( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.9( empty local-lis/les=45/46 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=45/46 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=0 lpr=46 pi=[27,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=45/46 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e47 e47: 3 total, 3 up, 3 in
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.1a( empty local-lis/les=45/46 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.e( empty local-lis/les=45/46 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.15( empty local-lis/les=45/46 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.1d( empty local-lis/les=45/46 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=45/46 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[3.11( empty local-lis/les=45/46 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=0 lpr=46 pi=[30,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=45/46 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=0 lpr=46 pi=[24,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:24 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e48 e48: 3 total, 3 up, 3 in
Jan 23 04:52:24 np0005593295 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 23 04:52:24 np0005593295 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 23 04:52:24 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:25 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e48 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:25 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e49 e49: 3 total, 3 up, 3 in
Jan 23 04:52:25 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:25 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:25 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0)
Jan 23 04:52:25 np0005593295 ceph-mon[75771]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:52:27 np0005593295 ceph-mon[75771]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 23 04:52:27 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:27 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:52:27 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:52:27 np0005593295 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:52:27 np0005593295 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:52:27 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:52:27 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:27 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.prgzmm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 23 04:52:27 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.prgzmm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 23 04:52:27 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e50 e50: 3 total, 3 up, 3 in
Jan 23 04:52:27 np0005593295 podman[82863]: 2026-01-23 09:52:27.848485607 +0000 UTC m=+0.042035579 container create b252e74bdf91393d1ef6772b1e8edffdc86c94d2f4cc508aa66400b413fbc621 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_euclid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Jan 23 04:52:27 np0005593295 systemd[1]: Started libpod-conmon-b252e74bdf91393d1ef6772b1e8edffdc86c94d2f4cc508aa66400b413fbc621.scope.
Jan 23 04:52:27 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:52:27 np0005593295 podman[82863]: 2026-01-23 09:52:27.829426714 +0000 UTC m=+0.022976706 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:52:27 np0005593295 podman[82863]: 2026-01-23 09:52:27.952891378 +0000 UTC m=+0.146441370 container init b252e74bdf91393d1ef6772b1e8edffdc86c94d2f4cc508aa66400b413fbc621 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_euclid, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:52:27 np0005593295 podman[82863]: 2026-01-23 09:52:27.960658473 +0000 UTC m=+0.154208445 container start b252e74bdf91393d1ef6772b1e8edffdc86c94d2f4cc508aa66400b413fbc621 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_euclid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1)
Jan 23 04:52:27 np0005593295 podman[82863]: 2026-01-23 09:52:27.964687269 +0000 UTC m=+0.158237231 container attach b252e74bdf91393d1ef6772b1e8edffdc86c94d2f4cc508aa66400b413fbc621 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_euclid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:52:27 np0005593295 vigilant_euclid[82879]: 167 167
Jan 23 04:52:27 np0005593295 systemd[1]: libpod-b252e74bdf91393d1ef6772b1e8edffdc86c94d2f4cc508aa66400b413fbc621.scope: Deactivated successfully.
Jan 23 04:52:27 np0005593295 podman[82863]: 2026-01-23 09:52:27.968332185 +0000 UTC m=+0.161882167 container died b252e74bdf91393d1ef6772b1e8edffdc86c94d2f4cc508aa66400b413fbc621 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_euclid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0)
Jan 23 04:52:28 np0005593295 systemd[1]: var-lib-containers-storage-overlay-1a1cd03d6952b18e3d893771895d0a5cc146194b0b66dbf1b838e954a3c9347d-merged.mount: Deactivated successfully.
Jan 23 04:52:28 np0005593295 podman[82863]: 2026-01-23 09:52:28.033909413 +0000 UTC m=+0.227459385 container remove b252e74bdf91393d1ef6772b1e8edffdc86c94d2f4cc508aa66400b413fbc621 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_euclid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:52:28 np0005593295 systemd[1]: libpod-conmon-b252e74bdf91393d1ef6772b1e8edffdc86c94d2f4cc508aa66400b413fbc621.scope: Deactivated successfully.
Jan 23 04:52:28 np0005593295 systemd[1]: Reloading.
Jan 23 04:52:28 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:52:28 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:52:28 np0005593295 ceph-mon[75771]: Deploying daemon mds.cephfs.compute-2.prgzmm on compute-2
Jan 23 04:52:28 np0005593295 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 23 04:52:28 np0005593295 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 23 04:52:28 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 23 04:52:28 np0005593295 systemd[1]: Reloading.
Jan 23 04:52:28 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:52:28 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:52:28 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e51 e51: 3 total, 3 up, 3 in
Jan 23 04:52:28 np0005593295 systemd[1]: Starting Ceph mds.cephfs.compute-2.prgzmm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:52:29 np0005593295 podman[83020]: 2026-01-23 09:52:29.297214602 +0000 UTC m=+0.066349488 container create a773ba3ad4991e41d239798ec097b4bbf1907c18732274fc30c903f6eda5a6f2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mds-cephfs-compute-2-prgzmm, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:52:29 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b637cac10d75ade71c1ac4f1d68eb4ec9651c03a72730fe42c20c1d2a5060beb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:29 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b637cac10d75ade71c1ac4f1d68eb4ec9651c03a72730fe42c20c1d2a5060beb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:29 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b637cac10d75ade71c1ac4f1d68eb4ec9651c03a72730fe42c20c1d2a5060beb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:29 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b637cac10d75ade71c1ac4f1d68eb4ec9651c03a72730fe42c20c1d2a5060beb/merged/var/lib/ceph/mds/ceph-cephfs.compute-2.prgzmm supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:29 np0005593295 podman[83020]: 2026-01-23 09:52:29.27611423 +0000 UTC m=+0.045249136 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:52:29 np0005593295 podman[83020]: 2026-01-23 09:52:29.518971192 +0000 UTC m=+0.288106108 container init a773ba3ad4991e41d239798ec097b4bbf1907c18732274fc30c903f6eda5a6f2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mds-cephfs-compute-2-prgzmm, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:52:29 np0005593295 podman[83020]: 2026-01-23 09:52:29.524459602 +0000 UTC m=+0.293594488 container start a773ba3ad4991e41d239798ec097b4bbf1907c18732274fc30c903f6eda5a6f2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mds-cephfs-compute-2-prgzmm, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:52:29 np0005593295 bash[83020]: a773ba3ad4991e41d239798ec097b4bbf1907c18732274fc30c903f6eda5a6f2
Jan 23 04:52:29 np0005593295 systemd[1]: Started Ceph mds.cephfs.compute-2.prgzmm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:52:29 np0005593295 ceph-mds[83039]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:52:29 np0005593295 ceph-mds[83039]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Jan 23 04:52:29 np0005593295 ceph-mds[83039]: main not setting numa affinity
Jan 23 04:52:29 np0005593295 ceph-mds[83039]: pidfile_write: ignore empty --pid-file
Jan 23 04:52:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mds-cephfs-compute-2-prgzmm[83035]: starting mds.cephfs.compute-2.prgzmm at 
Jan 23 04:52:29 np0005593295 ceph-mon[75771]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 04:52:29 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:29 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm Updating MDS map to version 2 from mon.1
Jan 23 04:52:30 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:30 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e52 e52: 3 total, 3 up, 3 in
Jan 23 04:52:30 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0)
Jan 23 04:52:30 np0005593295 ceph-mon[75771]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:52:30 np0005593295 ceph-mon[75771]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 23 04:52:30 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:30 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:30 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:52:30 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:52:30 np0005593295 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:52:30 np0005593295 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:52:31 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).mds e3 new map
Jan 23 04:52:31 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).mds e3 print_map#012e3#012btime 2026-01-23T09:52:30:834166+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:51:34.000760+0000#012modified#0112026-01-23T09:51:34.000760+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 0 members: #012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-2.prgzmm{-1:24193} state up:standby seq 1 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]
Jan 23 04:52:31 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm Updating MDS map to version 3 from mon.1
Jan 23 04:52:31 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm Monitors have assigned me to become a standby
Jan 23 04:52:31 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).mds e4 new map
Jan 23 04:52:31 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).mds e4 print_map#012e4#012btime 2026-01-23T09:52:31:070018+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:51:34.000760+0000#012modified#0112026-01-23T09:52:31.070004+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24193}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 0 members: #012[mds.cephfs.compute-2.prgzmm{0:24193} state up:creating seq 1 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]#012 #012 
Jan 23 04:52:31 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm Updating MDS map to version 4 from mon.1
Jan 23 04:52:31 np0005593295 ceph-mds[83039]: mds.0.4 handle_mds_map I am now mds.0.4
Jan 23 04:52:31 np0005593295 ceph-mds[83039]: mds.0.4 handle_mds_map state change up:standby --> up:creating
Jan 23 04:52:31 np0005593295 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x1
Jan 23 04:52:31 np0005593295 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x100
Jan 23 04:52:31 np0005593295 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x600
Jan 23 04:52:31 np0005593295 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x601
Jan 23 04:52:31 np0005593295 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x602
Jan 23 04:52:31 np0005593295 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x603
Jan 23 04:52:31 np0005593295 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x604
Jan 23 04:52:31 np0005593295 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x605
Jan 23 04:52:31 np0005593295 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x606
Jan 23 04:52:31 np0005593295 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x607
Jan 23 04:52:31 np0005593295 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x608
Jan 23 04:52:31 np0005593295 ceph-mds[83039]: mds.0.cache creating system inode with ino:0x609
Jan 23 04:52:31 np0005593295 ceph-mds[83039]: mds.0.4 creating_done
Jan 23 04:52:31 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e53 e53: 3 total, 3 up, 3 in
Jan 23 04:52:31 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0)
Jan 23 04:52:31 np0005593295 ceph-mon[75771]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:52:32 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:52:32 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:32 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.ymknms", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 23 04:52:32 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.ymknms", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 23 04:52:32 np0005593295 ceph-mon[75771]: Deploying daemon mds.cephfs.compute-0.ymknms on compute-0
Jan 23 04:52:32 np0005593295 ceph-mon[75771]: daemon mds.cephfs.compute-2.prgzmm assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Jan 23 04:52:32 np0005593295 ceph-mon[75771]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Jan 23 04:52:32 np0005593295 ceph-mon[75771]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Jan 23 04:52:32 np0005593295 ceph-mon[75771]: Cluster is now healthy
Jan 23 04:52:32 np0005593295 ceph-mon[75771]: daemon mds.cephfs.compute-2.prgzmm is now active in filesystem cephfs as rank 0
Jan 23 04:52:32 np0005593295 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 23 04:52:32 np0005593295 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 23 04:52:32 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 23 04:52:32 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:52:32 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:52:32 np0005593295 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:52:32 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:52:32 np0005593295 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:52:32 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).mds e5 new map
Jan 23 04:52:32 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).mds e5 print_map#012e5#012btime 2026-01-23T09:52:32:417167+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:51:34.000760+0000#012modified#0112026-01-23T09:52:32.417165+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24193}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 24193 members: 24193#012[mds.cephfs.compute-2.prgzmm{0:24193} state up:active seq 2 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]#012 #012 
Jan 23 04:52:32 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm Updating MDS map to version 5 from mon.1
Jan 23 04:52:32 np0005593295 ceph-mds[83039]: mds.0.4 handle_mds_map I am now mds.0.4
Jan 23 04:52:32 np0005593295 ceph-mds[83039]: mds.0.4 handle_mds_map state change up:creating --> up:active
Jan 23 04:52:32 np0005593295 ceph-mds[83039]: mds.0.4 recovery_done -- successful recovery!
Jan 23 04:52:32 np0005593295 ceph-mds[83039]: mds.0.4 active_start
Jan 23 04:52:32 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e54 e54: 3 total, 3 up, 3 in
Jan 23 04:52:33 np0005593295 ceph-mon[75771]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 04:52:33 np0005593295 ceph-mon[75771]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 23 04:52:33 np0005593295 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 23 04:52:33 np0005593295 ceph-mon[75771]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 23 04:52:33 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:52:33 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).mds e6 new map
Jan 23 04:52:33 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).mds e6 print_map#012e6#012btime 2026-01-23T09:52:33:487599+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:51:34.000760+0000#012modified#0112026-01-23T09:52:32.417165+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24193}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24193 members: 24193#012[mds.cephfs.compute-2.prgzmm{0:24193} state up:active seq 2 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.ymknms{-1:14502} state up:standby seq 1 addr [v2:192.168.122.100:6806/3718923574,v1:192.168.122.100:6807/3718923574] compat {c=[1],r=[1],i=[1fff]}]
Jan 23 04:52:34 np0005593295 radosgw[82185]: v1 topic migration: starting v1 topic migration..
Jan 23 04:52:34 np0005593295 radosgw[82185]: LDAP not started since no server URIs were provided in the configuration.
Jan 23 04:52:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-rgw-rgw-compute-2-yzflfx[82181]: 2026-01-23T09:52:34.009+0000 7f834d527980 -1 LDAP not started since no server URIs were provided in the configuration.
Jan 23 04:52:34 np0005593295 radosgw[82185]: v1 topic migration: finished v1 topic migration
Jan 23 04:52:34 np0005593295 radosgw[82185]: framework: beast
Jan 23 04:52:34 np0005593295 radosgw[82185]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Jan 23 04:52:34 np0005593295 radosgw[82185]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Jan 23 04:52:34 np0005593295 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 23 04:52:34 np0005593295 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 23 04:52:34 np0005593295 radosgw[82185]: starting handler: beast
Jan 23 04:52:34 np0005593295 radosgw[82185]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:52:34 np0005593295 radosgw[82185]: mgrc service_daemon_register rgw.24181 metadata {arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.102:8082,frontend_type#0=beast,hostname=compute-2,id=rgw.compute-2.yzflfx,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864316,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=75d0a494-c738-4cca-b87e-be71cfd0ed45,zone_name=default,zonegroup_id=6635d7c3-d02c-4c4b-90b3-4ee042e293d6,zonegroup_name=default}
Jan 23 04:52:34 np0005593295 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Jan 23 04:52:34 np0005593295 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Jan 23 04:52:34 np0005593295 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Jan 23 04:52:34 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e55 e55: 3 total, 3 up, 3 in
Jan 23 04:52:34 np0005593295 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Jan 23 04:52:35 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:35 np0005593295 ceph-mon[75771]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 23 04:52:35 np0005593295 ceph-mon[75771]: Cluster is now healthy
Jan 23 04:52:35 np0005593295 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Jan 23 04:52:35 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e55 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:35 np0005593295 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Jan 23 04:52:36 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e56 e56: 3 total, 3 up, 3 in
Jan 23 04:52:36 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:52:36 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:36 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:52:36 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:52:36 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:36 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.bcvzvj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 23 04:52:36 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.bcvzvj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 23 04:52:36 np0005593295 ceph-mon[75771]: Deploying daemon mds.cephfs.compute-1.bcvzvj on compute-1
Jan 23 04:52:37 np0005593295 ceph-mds[83039]: mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Jan 23 04:52:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mds-cephfs-compute-2-prgzmm[83035]: 2026-01-23T09:52:37.079+0000 7f6dd35e5640 -1 mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Jan 23 04:52:37 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e57 e57: 3 total, 3 up, 3 in
Jan 23 04:52:37 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:52:37 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:52:37 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:52:37 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:52:37 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:52:37 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:52:37 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:52:37 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:37 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:38 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e58 e58: 3 total, 3 up, 3 in
Jan 23 04:52:38 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).mds e7 new map
Jan 23 04:52:38 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).mds e7 print_map#012e7#012btime 2026-01-23T09:52:38:529421+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:51:34.000760+0000#012modified#0112026-01-23T09:52:32.417165+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24193}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24193 members: 24193#012[mds.cephfs.compute-2.prgzmm{0:24193} state up:active seq 2 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.ymknms{-1:14502} state up:standby seq 1 addr [v2:192.168.122.100:6806/3718923574,v1:192.168.122.100:6807/3718923574] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.bcvzvj{-1:24200} state up:standby seq 1 addr [v2:192.168.122.101:6804/2199615937,v1:192.168.122.101:6805/2199615937] compat {c=[1],r=[1],i=[1fff]}]
Jan 23 04:52:39 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:39 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:39 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:39 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:52:39 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:52:39 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:39 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.bawllm", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 23 04:52:40 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e59 e59: 3 total, 3 up, 3 in
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[9.16( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[9.17( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e59 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.16( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[9.3( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.2( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[9.8( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.9( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[9.5( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[9.1d( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.1c( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.f( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.d( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[9.b( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.a( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.11( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.3( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.b( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.15( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.6( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[9.7( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.5( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[9.18( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.1f( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[9.13( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[9.9( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 59 pg[8.c( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:40 np0005593295 ceph-mon[75771]: Creating key for client.nfs.cephfs.0.0.compute-1.bawllm
Jan 23 04:52:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.bawllm", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Jan 23 04:52:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:52:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:52:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:52:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:52:40 np0005593295 ceph-mon[75771]: Ensuring nfs.cephfs.0 is in the ganesha grace table
Jan 23 04:52:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 23 04:52:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Jan 23 04:52:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 23 04:52:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:52:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:52:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:52:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:52:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:52:40 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).mds e8 new map
Jan 23 04:52:40 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).mds e8 print_map#012e8#012btime 2026-01-23T09:52:40:798611+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0118#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:51:34.000760+0000#012modified#0112026-01-23T09:52:39.805778+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24193}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24193 members: 24193#012[mds.cephfs.compute-2.prgzmm{0:24193} state up:active seq 4 join_fscid=1 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.ymknms{-1:14502} state up:standby seq 1 addr [v2:192.168.122.100:6806/3718923574,v1:192.168.122.100:6807/3718923574] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.bcvzvj{-1:24200} state up:standby seq 1 addr [v2:192.168.122.101:6804/2199615937,v1:192.168.122.101:6805/2199615937] compat {c=[1],r=[1],i=[1fff]}]
Jan 23 04:52:40 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm Updating MDS map to version 8 from mon.1
Jan 23 04:52:41 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e60 e60: 3 total, 3 up, 3 in
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[9.16( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[9.17( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.16( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[9.3( v 44'12 (0'0,44'12] local-lis/les=59/60 n=1 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.15( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.3( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[9.13( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[9.9( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[9.18( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.1f( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.c( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.5( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.6( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.2( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[9.7( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=44'12 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.11( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.b( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[9.b( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[9.1d( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[9.5( v 44'12 (0'0,44'12] local-lis/les=59/60 n=1 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.1c( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.9( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.f( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.d( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[9.8( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [2] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 60 pg[8.a( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [2] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.16 scrub starts
Jan 23 04:52:41 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.16 scrub ok
Jan 23 04:52:41 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Jan 23 04:52:41 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:52:41 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.bawllm-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 04:52:41 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.bawllm-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 04:52:42 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).mds e9 new map
Jan 23 04:52:42 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).mds e9 print_map#012e9#012btime 2026-01-23T09:52:42:200523+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0118#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:51:34.000760+0000#012modified#0112026-01-23T09:52:39.805778+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24193}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24193 members: 24193#012[mds.cephfs.compute-2.prgzmm{0:24193} state up:active seq 4 join_fscid=1 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.ymknms{-1:14502} state up:standby seq 3 join_fscid=1 addr [v2:192.168.122.100:6806/3718923574,v1:192.168.122.100:6807/3718923574] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.bcvzvj{-1:24200} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/2199615937,v1:192.168.122.101:6805/2199615937] compat {c=[1],r=[1],i=[1fff]}]
Jan 23 04:52:42 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e61 e61: 3 total, 3 up, 3 in
Jan 23 04:52:42 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.17 scrub starts
Jan 23 04:52:42 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.17 scrub ok
Jan 23 04:52:43 np0005593295 ceph-mon[75771]: Rados config object exists: conf-nfs.cephfs
Jan 23 04:52:43 np0005593295 ceph-mon[75771]: Creating key for client.nfs.cephfs.0.0.compute-1.bawllm-rgw
Jan 23 04:52:43 np0005593295 ceph-mon[75771]: Bind address in nfs.cephfs.0.0.compute-1.bawllm's ganesha conf is defaulting to empty
Jan 23 04:52:43 np0005593295 ceph-mon[75771]: Deploying daemon nfs.cephfs.0.0.compute-1.bawllm on compute-1
Jan 23 04:52:43 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:52:43 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.3 scrub starts
Jan 23 04:52:43 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e62 e62: 3 total, 3 up, 3 in
Jan 23 04:52:43 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.3 scrub ok
Jan 23 04:52:44 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.15 scrub starts
Jan 23 04:52:44 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.15 scrub ok
Jan 23 04:52:45 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:45 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:45 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:45 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.tykohi", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 23 04:52:45 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.tykohi", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Jan 23 04:52:45 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 23 04:52:45 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Jan 23 04:52:45 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:45 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:52:45 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Jan 23 04:52:45 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:52:45 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:45 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.13 deep-scrub starts
Jan 23 04:52:45 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.13 deep-scrub ok
Jan 23 04:52:46 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e63 e63: 3 total, 3 up, 3 in
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.11( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.17( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.13( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.13( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.1( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.7( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.9( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.9( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.15( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.3( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.2( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.5( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.3( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.4( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.1e( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.1a( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.1d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.18( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.17( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.19( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.1b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.7( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.9 scrub starts
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[10.11( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[12.1d( empty local-lis/les=0/0 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[11.16( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[11.13( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[11.17( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[11.e( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[11.8( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[11.a( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[11.3( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 63 pg[11.19( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:46 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.9 scrub ok
Jan 23 04:52:46 np0005593295 ceph-mon[75771]: Creating key for client.nfs.cephfs.1.0.compute-2.tykohi
Jan 23 04:52:46 np0005593295 ceph-mon[75771]: Ensuring nfs.cephfs.1 is in the ganesha grace table
Jan 23 04:52:47 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e64 e64: 3 total, 3 up, 3 in
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.1d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.1d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.1b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.1b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.19( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.7( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.7( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.19( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.17( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.5( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.17( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.5( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.9( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.9( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.1( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.1( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.13( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.13( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.11( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.11( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.3( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.3( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.15( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[10.15( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=-1 lpr=64 pi=[59,64)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[11.a( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[11.3( v 60'51 lc 50'38 (0'0,60'51] local-lis/les=63/64 n=1 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=60'51 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[11.13( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[11.e( v 60'51 lc 50'26 (0'0,60'51] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=60'51 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[11.16( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[11.8( v 51'48 (0'0,51'48] local-lis/les=63/64 n=1 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[11.17( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[11.19( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [2] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.1d( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.1a( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.1e( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.3( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.2( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.11( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.7( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.9( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.4( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.17( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.18( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 64 pg[12.13( empty local-lis/les=63/64 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63) [2] r=0 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.18 scrub starts
Jan 23 04:52:47 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.18 scrub ok
Jan 23 04:52:47 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:52:47 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Jan 23 04:52:47 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:52:47 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Jan 23 04:52:47 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Jan 23 04:52:48 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e65 e65: 3 total, 3 up, 3 in
Jan 23 04:52:48 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 23 04:52:48 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Jan 23 04:52:48 np0005593295 ceph-mon[75771]: Rados config object exists: conf-nfs.cephfs
Jan 23 04:52:48 np0005593295 ceph-mon[75771]: Creating key for client.nfs.cephfs.1.0.compute-2.tykohi-rgw
Jan 23 04:52:48 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.tykohi-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 04:52:48 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.tykohi-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 04:52:48 np0005593295 ceph-mon[75771]: Bind address in nfs.cephfs.1.0.compute-2.tykohi's ganesha conf is defaulting to empty
Jan 23 04:52:48 np0005593295 ceph-mon[75771]: Deploying daemon nfs.cephfs.1.0.compute-2.tykohi on compute-2
Jan 23 04:52:49 np0005593295 podman[83196]: 2026-01-23 09:52:49.586198175 +0000 UTC m=+0.063662444 container create ff1841fbb3ce48b7c39f1e29f26c7d0c10e4dcaa4f9e7f89f7a925aa8a4705cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_albattani, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1)
Jan 23 04:52:49 np0005593295 systemd[1]: Started libpod-conmon-ff1841fbb3ce48b7c39f1e29f26c7d0c10e4dcaa4f9e7f89f7a925aa8a4705cc.scope.
Jan 23 04:52:49 np0005593295 podman[83196]: 2026-01-23 09:52:49.565456892 +0000 UTC m=+0.042921181 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:52:49 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:52:49 np0005593295 podman[83196]: 2026-01-23 09:52:49.69498555 +0000 UTC m=+0.172449839 container init ff1841fbb3ce48b7c39f1e29f26c7d0c10e4dcaa4f9e7f89f7a925aa8a4705cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_albattani, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Jan 23 04:52:49 np0005593295 podman[83196]: 2026-01-23 09:52:49.701372151 +0000 UTC m=+0.178836420 container start ff1841fbb3ce48b7c39f1e29f26c7d0c10e4dcaa4f9e7f89f7a925aa8a4705cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_albattani, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Jan 23 04:52:49 np0005593295 upbeat_albattani[83212]: 167 167
Jan 23 04:52:49 np0005593295 systemd[1]: libpod-ff1841fbb3ce48b7c39f1e29f26c7d0c10e4dcaa4f9e7f89f7a925aa8a4705cc.scope: Deactivated successfully.
Jan 23 04:52:49 np0005593295 conmon[83212]: conmon ff1841fbb3ce48b7c39f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ff1841fbb3ce48b7c39f1e29f26c7d0c10e4dcaa4f9e7f89f7a925aa8a4705cc.scope/container/memory.events
Jan 23 04:52:49 np0005593295 podman[83196]: 2026-01-23 09:52:49.710081198 +0000 UTC m=+0.187545467 container attach ff1841fbb3ce48b7c39f1e29f26c7d0c10e4dcaa4f9e7f89f7a925aa8a4705cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_albattani, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:52:49 np0005593295 podman[83196]: 2026-01-23 09:52:49.710513619 +0000 UTC m=+0.187977888 container died ff1841fbb3ce48b7c39f1e29f26c7d0c10e4dcaa4f9e7f89f7a925aa8a4705cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_albattani, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:52:49 np0005593295 systemd[1]: var-lib-containers-storage-overlay-e08e1c1620331f43b94956035d7db3bc164be881adfcde6834723e5e11a23a0a-merged.mount: Deactivated successfully.
Jan 23 04:52:49 np0005593295 podman[83196]: 2026-01-23 09:52:49.756064311 +0000 UTC m=+0.233528580 container remove ff1841fbb3ce48b7c39f1e29f26c7d0c10e4dcaa4f9e7f89f7a925aa8a4705cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_albattani, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:52:49 np0005593295 systemd[1]: libpod-conmon-ff1841fbb3ce48b7c39f1e29f26c7d0c10e4dcaa4f9e7f89f7a925aa8a4705cc.scope: Deactivated successfully.
Jan 23 04:52:49 np0005593295 systemd[1]: Reloading.
Jan 23 04:52:49 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:52:49 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:52:50 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e66 e66: 3 total, 3 up, 3 in
Jan 23 04:52:50 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=0/0 n=4 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 luod=0'0 crt=60'756 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:50 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=0/0 n=4 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=60'756 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:50 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 luod=0'0 crt=62'759 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:50 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'759 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:50 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.5( v 65'770 (0'0,65'770] local-lis/les=0/0 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 luod=0'0 crt=62'766 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:50 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.5( v 65'770 (0'0,65'770] local-lis/les=0/0 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'766 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:50 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=0/0 n=4 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 luod=0'0 crt=58'754 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:50 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=0/0 n=4 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:50 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.1( v 62'768 (0'0,62'768] local-lis/les=0/0 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 luod=0'0 crt=62'768 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:50 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.1( v 62'768 (0'0,62'768] local-lis/les=0/0 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'768 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:50 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=0/0 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 luod=0'0 crt=62'759 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:50 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 luod=0'0 crt=62'759 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:50 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'759 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:50 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=0/0 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'759 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:50 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 luod=0'0 crt=62'771 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:50 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'771 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:50 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 luod=0'0 crt=62'759 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:50 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'759 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:50 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.11( v 62'769 (0'0,62'769] local-lis/les=0/0 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 luod=0'0 crt=62'769 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:50 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.11( v 62'769 (0'0,62'769] local-lis/les=0/0 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'769 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:50 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.3( v 62'765 (0'0,62'765] local-lis/les=0/0 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 luod=0'0 crt=62'765 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:50 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 66 pg[10.3( v 62'765 (0'0,62'765] local-lis/les=0/0 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'765 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:50 np0005593295 systemd[1]: Reloading.
Jan 23 04:52:50 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:52:50 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:52:50 np0005593295 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:52:50 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e66 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:50 np0005593295 podman[83351]: 2026-01-23 09:52:50.63684899 +0000 UTC m=+0.034766827 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:52:50 np0005593295 podman[83351]: 2026-01-23 09:52:50.734288286 +0000 UTC m=+0.132206103 container create 25dd11a4fcdbe97d844db7d4fb971576b159944b07e8d21ef2be7d36d99ebd7d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:52:50 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d0f8416f7052c607630e33d06ff2a3ec2436d092e8cfcdd013926939d221c79/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:50 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d0f8416f7052c607630e33d06ff2a3ec2436d092e8cfcdd013926939d221c79/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:50 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d0f8416f7052c607630e33d06ff2a3ec2436d092e8cfcdd013926939d221c79/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:50 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d0f8416f7052c607630e33d06ff2a3ec2436d092e8cfcdd013926939d221c79/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:52:50 np0005593295 podman[83351]: 2026-01-23 09:52:50.811589363 +0000 UTC m=+0.209507200 container init 25dd11a4fcdbe97d844db7d4fb971576b159944b07e8d21ef2be7d36d99ebd7d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:52:50 np0005593295 podman[83351]: 2026-01-23 09:52:50.816870638 +0000 UTC m=+0.214788455 container start 25dd11a4fcdbe97d844db7d4fb971576b159944b07e8d21ef2be7d36d99ebd7d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:52:50 np0005593295 bash[83351]: 25dd11a4fcdbe97d844db7d4fb971576b159944b07e8d21ef2be7d36d99ebd7d
Jan 23 04:52:50 np0005593295 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:52:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:50 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 04:52:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:50 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 04:52:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:50 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 04:52:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:50 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 04:52:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:50 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 04:52:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:50 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 04:52:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:50 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 04:52:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:50 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:52:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:51 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Jan 23 04:52:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:51 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Jan 23 04:52:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:51 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:52:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:51 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:52:51 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e67 e67: 3 total, 3 up, 3 in
Jan 23 04:52:51 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:51 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:51 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:51 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fenqiu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 23 04:52:51 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fenqiu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Jan 23 04:52:51 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=0/0 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 luod=0'0 crt=62'761 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=0/0 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 crt=62'761 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=0/0 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 luod=0'0 crt=62'764 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=0/0 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 crt=62'764 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 luod=0'0 crt=62'771 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 crt=62'771 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=0/0 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 luod=0'0 crt=62'768 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=0/0 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 crt=62'768 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 luod=0'0 crt=62'759 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 crt=62'759 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'759 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=66/67 n=4 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=60'756 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.5( v 65'770 (0'0,65'770] local-lis/les=66/67 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=65'770 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=66/67 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'759 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=66/67 n=4 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=58'754 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'759 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.1( v 62'768 (0'0,62'768] local-lis/les=66/67 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'768 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=66/67 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'771 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.11( v 62'769 (0'0,62'769] local-lis/les=66/67 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'769 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'759 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 67 pg[10.3( v 62'765 (0'0,62'765] local-lis/les=66/67 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=62'765 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.17 scrub starts
Jan 23 04:52:51 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.17 scrub ok
Jan 23 04:52:52 np0005593295 ceph-mon[75771]: Creating key for client.nfs.cephfs.2.0.compute-0.fenqiu
Jan 23 04:52:52 np0005593295 ceph-mon[75771]: Ensuring nfs.cephfs.2 is in the ganesha grace table
Jan 23 04:52:52 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Jan 23 04:52:52 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e68 e68: 3 total, 3 up, 3 in
Jan 23 04:52:52 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 68 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=67/68 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 crt=62'761 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:52 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 68 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=67/68 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 crt=62'764 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:52 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 68 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=67/68 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 crt=62'768 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:52 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 68 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=67/68 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 crt=62'759 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:52 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 68 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=67/68 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67) [2] r=0 lpr=67 pi=[59,67)/1 crt=62'771 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:52:52 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.7 scrub starts
Jan 23 04:52:52 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.7 scrub ok
Jan 23 04:52:53 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.5 scrub starts
Jan 23 04:52:53 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.5 scrub ok
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000002:nfs.cephfs.1: -2
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 04:52:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 04:52:54 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.3 scrub starts
Jan 23 04:52:54 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.3 scrub ok
Jan 23 04:52:54 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 23 04:52:54 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Jan 23 04:52:54 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fenqiu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 04:52:54 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fenqiu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 04:52:55 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.1f scrub starts
Jan 23 04:52:55 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.1f scrub ok
Jan 23 04:52:55 np0005593295 ceph-mon[75771]: Rados config object exists: conf-nfs.cephfs
Jan 23 04:52:55 np0005593295 ceph-mon[75771]: Creating key for client.nfs.cephfs.2.0.compute-0.fenqiu-rgw
Jan 23 04:52:55 np0005593295 ceph-mon[75771]: Bind address in nfs.cephfs.2.0.compute-0.fenqiu's ganesha conf is defaulting to empty
Jan 23 04:52:55 np0005593295 ceph-mon[75771]: Deploying daemon nfs.cephfs.2.0.compute-0.fenqiu on compute-0
Jan 23 04:52:55 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e68 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:52:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:56 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:52:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:56 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:52:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:56 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:52:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:56 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:52:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:56 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:52:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:52:56 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 04:52:56 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.5 deep-scrub starts
Jan 23 04:52:56 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.5 deep-scrub ok
Jan 23 04:52:56 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:56 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:56 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:56 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:56 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:52:57 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.2 deep-scrub starts
Jan 23 04:52:57 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.2 deep-scrub ok
Jan 23 04:52:57 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e69 e69: 3 total, 3 up, 3 in
Jan 23 04:52:57 np0005593295 ceph-mon[75771]: Deploying daemon haproxy.nfs.cephfs.compute-1.mnxlgm on compute-1
Jan 23 04:52:57 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Jan 23 04:52:58 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.6 scrub starts
Jan 23 04:52:58 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.6 scrub ok
Jan 23 04:52:58 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Jan 23 04:52:59 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.11 scrub starts
Jan 23 04:52:59 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.11 scrub ok
Jan 23 04:52:59 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e70 e70: 3 total, 3 up, 3 in
Jan 23 04:52:59 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Jan 23 04:53:00 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 70 pg[10.14( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70) [2] r=0 lpr=70 pi=[59,70)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:00 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 70 pg[10.4( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70) [2] r=0 lpr=70 pi=[59,70)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:00 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 70 pg[10.1c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70) [2] r=0 lpr=70 pi=[59,70)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:00 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 70 pg[10.c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70) [2] r=0 lpr=70 pi=[59,70)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:00 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.b scrub starts
Jan 23 04:53:00 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.b scrub ok
Jan 23 04:53:00 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e70 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:01 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Jan 23 04:53:01 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:01 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e71 e71: 3 total, 3 up, 3 in
Jan 23 04:53:01 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 71 pg[10.1c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=-1 lpr=71 pi=[59,71)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:01 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 71 pg[10.1c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=-1 lpr=71 pi=[59,71)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:01 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 71 pg[10.4( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=-1 lpr=71 pi=[59,71)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:01 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 71 pg[10.4( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=-1 lpr=71 pi=[59,71)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:01 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 71 pg[10.c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=-1 lpr=71 pi=[59,71)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:01 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 71 pg[10.c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=-1 lpr=71 pi=[59,71)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:01 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 71 pg[10.14( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=-1 lpr=71 pi=[59,71)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:01 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 71 pg[10.14( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=-1 lpr=71 pi=[59,71)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:01 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.c scrub starts
Jan 23 04:53:01 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.c scrub ok
Jan 23 04:53:02 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.5 deep-scrub starts
Jan 23 04:53:02 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.5 deep-scrub ok
Jan 23 04:53:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:02 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6854000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:02 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e72 e72: 3 total, 3 up, 3 in
Jan 23 04:53:03 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:03 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:03 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:03 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.1d scrub starts
Jan 23 04:53:03 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.1d scrub ok
Jan 23 04:53:03 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e73 e73: 3 total, 3 up, 3 in
Jan 23 04:53:03 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 73 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=0/0 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 luod=0'0 crt=62'764 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:03 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 73 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=0/0 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 crt=62'764 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:03 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 73 pg[10.4( v 72'768 (0'0,72'768] local-lis/les=0/0 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 luod=0'0 crt=62'764 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:03 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 73 pg[10.4( v 72'768 (0'0,72'768] local-lis/les=0/0 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 crt=62'764 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:03 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 73 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=0/0 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 luod=0'0 crt=62'764 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:03 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 73 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=0/0 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 crt=62'764 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:03 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 73 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 luod=0'0 crt=62'771 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:03 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 73 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 crt=62'771 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:04 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.1c scrub starts
Jan 23 04:53:04 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.1c scrub ok
Jan 23 04:53:04 np0005593295 ceph-mon[75771]: Deploying daemon haproxy.nfs.cephfs.compute-0.yeogal on compute-0
Jan 23 04:53:04 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e74 e74: 3 total, 3 up, 3 in
Jan 23 04:53:04 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 74 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=73/74 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 crt=62'764 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:04 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 74 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=73/74 n=7 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 crt=62'771 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:04 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 74 pg[10.4( v 72'768 (0'0,72'768] local-lis/les=73/74 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 crt=72'768 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:04 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 74 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=73/74 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73) [2] r=0 lpr=73 pi=[59,73)/1 crt=62'764 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:04 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6840001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:05 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.b deep-scrub starts
Jan 23 04:53:05 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.b deep-scrub ok
Jan 23 04:53:05 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e74 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:05 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:06 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.f scrub starts
Jan 23 04:53:06 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.f scrub ok
Jan 23 04:53:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:06 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:07 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.9 scrub starts
Jan 23 04:53:07 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.9 scrub ok
Jan 23 04:53:08 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Jan 23 04:53:08 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e75 e75: 3 total, 3 up, 3 in
Jan 23 04:53:08 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 75 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=67/68 n=6 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=75 pruub=15.869682312s) [1] r=-1 lpr=75 pi=[67,75)/1 crt=62'764 mlcod 0'0 active pruub 82.272300720s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:08 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 75 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=67/68 n=6 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=75 pruub=15.869649887s) [1] r=-1 lpr=75 pi=[67,75)/1 crt=62'764 mlcod 0'0 unknown NOTIFY pruub 82.272300720s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:08 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 75 pg[10.5( v 68'773 (0'0,68'773] local-lis/les=66/67 n=8 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=75 pruub=14.722684860s) [1] r=-1 lpr=75 pi=[66,75)/1 crt=67'771 lcod 67'772 mlcod 67'772 active pruub 81.125633240s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:08 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 75 pg[10.5( v 68'773 (0'0,68'773] local-lis/les=66/67 n=8 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=75 pruub=14.722566605s) [1] r=-1 lpr=75 pi=[66,75)/1 crt=67'771 lcod 67'772 mlcod 0'0 unknown NOTIFY pruub 81.125633240s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:08 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 75 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=67/68 n=6 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=75 pruub=15.868231773s) [1] r=-1 lpr=75 pi=[67,75)/1 crt=62'768 mlcod 0'0 active pruub 82.272315979s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:08 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 75 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=67/68 n=6 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=75 pruub=15.868209839s) [1] r=-1 lpr=75 pi=[67,75)/1 crt=62'768 mlcod 0'0 unknown NOTIFY pruub 82.272315979s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:08 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 75 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=67/68 n=5 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=75 pruub=15.867843628s) [1] r=-1 lpr=75 pi=[67,75)/1 crt=62'759 mlcod 0'0 active pruub 82.272323608s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:08 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 75 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=67/68 n=5 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=75 pruub=15.867819786s) [1] r=-1 lpr=75 pi=[67,75)/1 crt=62'759 mlcod 0'0 unknown NOTIFY pruub 82.272323608s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:08 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e76 e76: 3 total, 3 up, 3 in
Jan 23 04:53:08 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 76 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=67/68 n=6 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=76) [1]/[2] r=0 lpr=76 pi=[67,76)/1 crt=62'764 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:08 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 76 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=67/68 n=6 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=76) [1]/[2] r=0 lpr=76 pi=[67,76)/1 crt=62'764 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:08 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 76 pg[10.5( v 68'773 (0'0,68'773] local-lis/les=66/67 n=8 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=76) [1]/[2] r=0 lpr=76 pi=[66,76)/1 crt=67'771 lcod 67'772 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:08 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 76 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=67/68 n=6 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=76) [1]/[2] r=0 lpr=76 pi=[67,76)/1 crt=62'768 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:08 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 76 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=67/68 n=6 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=76) [1]/[2] r=0 lpr=76 pi=[67,76)/1 crt=62'768 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:08 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 76 pg[10.5( v 68'773 (0'0,68'773] local-lis/les=66/67 n=8 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=76) [1]/[2] r=0 lpr=76 pi=[66,76)/1 crt=67'771 lcod 67'772 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:08 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 76 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=67/68 n=5 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=76) [1]/[2] r=0 lpr=76 pi=[67,76)/1 crt=62'759 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:08 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 76 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=67/68 n=5 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=76) [1]/[2] r=0 lpr=76 pi=[67,76)/1 crt=62'759 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:08 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.8 scrub starts
Jan 23 04:53:08 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.8 scrub ok
Jan 23 04:53:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:08 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:09 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6848001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:09 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Jan 23 04:53:09 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:09 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:09 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:09 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Jan 23 04:53:09 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e77 e77: 3 total, 3 up, 3 in
Jan 23 04:53:09 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 77 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=76/77 n=5 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=76) [1]/[2] async=[1] r=0 lpr=76 pi=[67,76)/1 crt=62'759 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:09 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 77 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=76/77 n=6 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=76) [1]/[2] async=[1] r=0 lpr=76 pi=[67,76)/1 crt=62'764 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:09 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 77 pg[10.5( v 68'773 (0'0,68'773] local-lis/les=76/77 n=8 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=76) [1]/[2] async=[1] r=0 lpr=76 pi=[66,76)/1 crt=68'773 lcod 67'772 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:09 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 77 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=76/77 n=6 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=76) [1]/[2] async=[1] r=0 lpr=76 pi=[67,76)/1 crt=62'768 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:10 np0005593295 ceph-mon[75771]: Deploying daemon haproxy.nfs.cephfs.compute-2.bbaqsj on compute-2
Jan 23 04:53:10 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Jan 23 04:53:10 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e77 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:10 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e78 e78: 3 total, 3 up, 3 in
Jan 23 04:53:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:10 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6840001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:10 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 78 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=76/77 n=6 ec=59/46 lis/c=76/67 les/c/f=77/68/0 sis=78 pruub=14.722089767s) [1] async=[1] r=-1 lpr=78 pi=[67,78)/1 crt=62'768 mlcod 62'768 active pruub 83.543327332s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:10 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 78 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=76/77 n=5 ec=59/46 lis/c=76/67 les/c/f=77/68/0 sis=78 pruub=14.721847534s) [1] async=[1] r=-1 lpr=78 pi=[67,78)/1 crt=62'759 mlcod 62'759 active pruub 83.543106079s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:10 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 78 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=76/77 n=5 ec=59/46 lis/c=76/67 les/c/f=77/68/0 sis=78 pruub=14.721715927s) [1] r=-1 lpr=78 pi=[67,78)/1 crt=62'759 mlcod 0'0 unknown NOTIFY pruub 83.543106079s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:10 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 78 pg[10.5( v 77'776 (0'0,77'776] local-lis/les=76/77 n=8 ec=59/46 lis/c=76/66 les/c/f=77/67/0 sis=78 pruub=14.721512794s) [1] async=[1] r=-1 lpr=78 pi=[66,78)/1 crt=68'773 lcod 77'775 mlcod 77'775 active pruub 83.543243408s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:10 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 78 pg[10.5( v 77'776 (0'0,77'776] local-lis/les=76/77 n=8 ec=59/46 lis/c=76/66 les/c/f=77/67/0 sis=78 pruub=14.721419334s) [1] r=-1 lpr=78 pi=[66,78)/1 crt=68'773 lcod 77'775 mlcod 0'0 unknown NOTIFY pruub 83.543243408s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:10 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 78 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=76/77 n=6 ec=59/46 lis/c=76/67 les/c/f=77/68/0 sis=78 pruub=14.721409798s) [1] r=-1 lpr=78 pi=[67,78)/1 crt=62'768 mlcod 0'0 unknown NOTIFY pruub 83.543327332s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:10 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 78 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=76/77 n=6 ec=59/46 lis/c=76/67 les/c/f=77/68/0 sis=78 pruub=14.721117973s) [1] async=[1] r=-1 lpr=78 pi=[67,78)/1 crt=62'764 mlcod 62'764 active pruub 83.543182373s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:10 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 78 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=76/77 n=6 ec=59/46 lis/c=76/67 les/c/f=77/68/0 sis=78 pruub=14.721064568s) [1] r=-1 lpr=78 pi=[67,78)/1 crt=62'764 mlcod 0'0 unknown NOTIFY pruub 83.543182373s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:11 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68340016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:11 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e79 e79: 3 total, 3 up, 3 in
Jan 23 04:53:11 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Jan 23 04:53:11 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 79 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=67/68 n=5 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=79 pruub=12.436098099s) [0] r=-1 lpr=79 pi=[67,79)/1 crt=62'761 mlcod 0'0 active pruub 82.265121460s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:11 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 79 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=67/68 n=5 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=79 pruub=12.436048508s) [0] r=-1 lpr=79 pi=[67,79)/1 crt=62'761 mlcod 0'0 unknown NOTIFY pruub 82.265121460s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:11 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 79 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=79 pruub=11.296006203s) [0] r=-1 lpr=79 pi=[66,79)/1 crt=62'759 mlcod 0'0 active pruub 81.125587463s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:11 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 79 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=79 pruub=11.295859337s) [0] r=-1 lpr=79 pi=[66,79)/1 crt=62'759 mlcod 0'0 unknown NOTIFY pruub 81.125587463s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:11 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 79 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=66/67 n=3 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=79 pruub=11.292468071s) [0] r=-1 lpr=79 pi=[66,79)/1 crt=62'759 mlcod 0'0 active pruub 81.122230530s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:11 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 79 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=66/67 n=3 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=79 pruub=11.292396545s) [0] r=-1 lpr=79 pi=[66,79)/1 crt=62'759 mlcod 0'0 unknown NOTIFY pruub 81.122230530s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:11 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 79 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=66/67 n=7 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=79 pruub=11.295536041s) [0] r=-1 lpr=79 pi=[66,79)/1 crt=62'771 mlcod 0'0 active pruub 81.125885010s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:11 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 79 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=66/67 n=7 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=79 pruub=11.295515060s) [0] r=-1 lpr=79 pi=[66,79)/1 crt=62'771 mlcod 0'0 unknown NOTIFY pruub 81.125885010s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:12 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.1c scrub starts
Jan 23 04:53:12 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.1c scrub ok
Jan 23 04:53:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:12 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68280016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:12 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Jan 23 04:53:12 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e80 e80: 3 total, 3 up, 3 in
Jan 23 04:53:12 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 80 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=66/67 n=3 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=0 lpr=80 pi=[66,80)/1 crt=62'759 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:12 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 80 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=66/67 n=3 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=0 lpr=80 pi=[66,80)/1 crt=62'759 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:12 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 80 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=67/68 n=5 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=80) [0]/[2] r=0 lpr=80 pi=[67,80)/1 crt=62'761 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:12 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 80 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=67/68 n=5 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=80) [0]/[2] r=0 lpr=80 pi=[67,80)/1 crt=62'761 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:12 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 80 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=66/67 n=7 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=0 lpr=80 pi=[66,80)/1 crt=62'771 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:12 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 80 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=66/67 n=7 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=0 lpr=80 pi=[66,80)/1 crt=62'771 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:12 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 80 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=0 lpr=80 pi=[66,80)/1 crt=62'759 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:12 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 80 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=0 lpr=80 pi=[66,80)/1 crt=62'759 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:13 np0005593295 podman[83515]: 2026-01-23 09:53:13.267203207 +0000 UTC m=+3.757022439 container create 3467097f00d36d1528b6f66c7922dcbfd805c3cef88ae48dd17d1d2fe65cfccb (image=quay.io/ceph/haproxy:2.3, name=boring_nightingale)
Jan 23 04:53:13 np0005593295 podman[83515]: 2026-01-23 09:53:13.250488626 +0000 UTC m=+3.740307878 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 23 04:53:13 np0005593295 systemd[1]: Started libpod-conmon-3467097f00d36d1528b6f66c7922dcbfd805c3cef88ae48dd17d1d2fe65cfccb.scope.
Jan 23 04:53:13 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:53:13 np0005593295 podman[83515]: 2026-01-23 09:53:13.343659 +0000 UTC m=+3.833478242 container init 3467097f00d36d1528b6f66c7922dcbfd805c3cef88ae48dd17d1d2fe65cfccb (image=quay.io/ceph/haproxy:2.3, name=boring_nightingale)
Jan 23 04:53:13 np0005593295 podman[83515]: 2026-01-23 09:53:13.352290342 +0000 UTC m=+3.842109584 container start 3467097f00d36d1528b6f66c7922dcbfd805c3cef88ae48dd17d1d2fe65cfccb (image=quay.io/ceph/haproxy:2.3, name=boring_nightingale)
Jan 23 04:53:13 np0005593295 podman[83515]: 2026-01-23 09:53:13.356417349 +0000 UTC m=+3.846236601 container attach 3467097f00d36d1528b6f66c7922dcbfd805c3cef88ae48dd17d1d2fe65cfccb (image=quay.io/ceph/haproxy:2.3, name=boring_nightingale)
Jan 23 04:53:13 np0005593295 boring_nightingale[83631]: 0 0
Jan 23 04:53:13 np0005593295 systemd[1]: libpod-3467097f00d36d1528b6f66c7922dcbfd805c3cef88ae48dd17d1d2fe65cfccb.scope: Deactivated successfully.
Jan 23 04:53:13 np0005593295 conmon[83631]: conmon 3467097f00d36d1528b6 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-3467097f00d36d1528b6f66c7922dcbfd805c3cef88ae48dd17d1d2fe65cfccb.scope/container/memory.events
Jan 23 04:53:13 np0005593295 podman[83515]: 2026-01-23 09:53:13.359301316 +0000 UTC m=+3.849120548 container died 3467097f00d36d1528b6f66c7922dcbfd805c3cef88ae48dd17d1d2fe65cfccb (image=quay.io/ceph/haproxy:2.3, name=boring_nightingale)
Jan 23 04:53:13 np0005593295 systemd[1]: var-lib-containers-storage-overlay-3013f51abecae5064cc04fa32df82e14c54fabed3889a2cee404031f0085f126-merged.mount: Deactivated successfully.
Jan 23 04:53:13 np0005593295 podman[83515]: 2026-01-23 09:53:13.41063739 +0000 UTC m=+3.900456622 container remove 3467097f00d36d1528b6f66c7922dcbfd805c3cef88ae48dd17d1d2fe65cfccb (image=quay.io/ceph/haproxy:2.3, name=boring_nightingale)
Jan 23 04:53:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:13 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68480025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:13 np0005593295 systemd[1]: libpod-conmon-3467097f00d36d1528b6f66c7922dcbfd805c3cef88ae48dd17d1d2fe65cfccb.scope: Deactivated successfully.
Jan 23 04:53:13 np0005593295 systemd[1]: Reloading.
Jan 23 04:53:13 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.1b scrub starts
Jan 23 04:53:13 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.1b scrub ok
Jan 23 04:53:13 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:53:13 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:53:13 np0005593295 systemd[1]: Reloading.
Jan 23 04:53:13 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:53:13 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:53:14 np0005593295 systemd[1]: Starting Ceph haproxy.nfs.cephfs.compute-2.bbaqsj for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:53:14 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Jan 23 04:53:14 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e81 e81: 3 total, 3 up, 3 in
Jan 23 04:53:14 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 81 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=80/81 n=3 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] async=[0] r=0 lpr=80 pi=[66,80)/1 crt=62'759 mlcod 0'0 active+remapped mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:14 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 81 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=80/81 n=7 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] async=[0] r=0 lpr=80 pi=[66,80)/1 crt=62'771 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:14 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 81 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=80/81 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] async=[0] r=0 lpr=80 pi=[66,80)/1 crt=62'759 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:14 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 81 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=80/81 n=5 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=80) [0]/[2] async=[0] r=0 lpr=80 pi=[67,80)/1 crt=62'761 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:14 np0005593295 podman[83775]: 2026-01-23 09:53:14.307550782 +0000 UTC m=+0.040058120 container create c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 04:53:14 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/687fed0fbddf5b7be24ce0985318e75d8f4a36ccd1305243ec9ed7a638372074/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Jan 23 04:53:14 np0005593295 podman[83775]: 2026-01-23 09:53:14.363312679 +0000 UTC m=+0.095820027 container init c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 04:53:14 np0005593295 podman[83775]: 2026-01-23 09:53:14.368408849 +0000 UTC m=+0.100916187 container start c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 04:53:14 np0005593295 bash[83775]: c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26
Jan 23 04:53:14 np0005593295 podman[83775]: 2026-01-23 09:53:14.290696437 +0000 UTC m=+0.023203795 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 23 04:53:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [NOTICE] 022/095314 (2) : New worker #1 (4) forked
Jan 23 04:53:14 np0005593295 systemd[1]: Started Ceph haproxy.nfs.cephfs.compute-2.bbaqsj for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:53:14 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.19 scrub starts
Jan 23 04:53:14 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.19 scrub ok
Jan 23 04:53:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:14 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6840001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:15 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68340016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:15 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Jan 23 04:53:15 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:15 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:15 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:15 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:15 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:15 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e82 e82: 3 total, 3 up, 3 in
Jan 23 04:53:15 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 82 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=80/81 n=5 ec=59/46 lis/c=80/67 les/c/f=81/68/0 sis=82 pruub=14.999643326s) [0] async=[0] r=-1 lpr=82 pi=[67,82)/1 crt=62'761 mlcod 62'761 active pruub 88.190757751s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:15 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 82 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=80/81 n=5 ec=59/46 lis/c=80/67 les/c/f=81/68/0 sis=82 pruub=14.999567986s) [0] r=-1 lpr=82 pi=[67,82)/1 crt=62'761 mlcod 0'0 unknown NOTIFY pruub 88.190757751s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:15 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 82 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=80/81 n=5 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82 pruub=14.997087479s) [0] async=[0] r=-1 lpr=82 pi=[66,82)/1 crt=62'759 mlcod 62'759 active pruub 88.190704346s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:15 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 82 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=80/81 n=5 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82 pruub=14.996816635s) [0] r=-1 lpr=82 pi=[66,82)/1 crt=62'759 mlcod 0'0 unknown NOTIFY pruub 88.190704346s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:15 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 82 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=80/81 n=3 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82 pruub=14.996311188s) [0] async=[0] r=-1 lpr=82 pi=[66,82)/1 crt=62'759 mlcod 62'759 active pruub 88.190559387s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:15 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 82 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=80/81 n=3 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82 pruub=14.996195793s) [0] r=-1 lpr=82 pi=[66,82)/1 crt=62'759 mlcod 0'0 unknown NOTIFY pruub 88.190559387s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:15 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 82 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=80/81 n=7 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82 pruub=14.996054649s) [0] async=[0] r=-1 lpr=82 pi=[66,82)/1 crt=62'771 mlcod 62'771 active pruub 88.190666199s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:15 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 82 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=80/81 n=7 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82 pruub=14.995622635s) [0] r=-1 lpr=82 pi=[66,82)/1 crt=62'771 mlcod 0'0 unknown NOTIFY pruub 88.190666199s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:15 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68280016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:15 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.d scrub starts
Jan 23 04:53:15 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.d scrub ok
Jan 23 04:53:15 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:16 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e83 e83: 3 total, 3 up, 3 in
Jan 23 04:53:16 np0005593295 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 23 04:53:16 np0005593295 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 04:53:16 np0005593295 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 04:53:16 np0005593295 ceph-mon[75771]: Deploying daemon keepalived.nfs.cephfs.compute-1.vcrquf on compute-1
Jan 23 04:53:16 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.a scrub starts
Jan 23 04:53:16 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 8.a scrub ok
Jan 23 04:53:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:16 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68480025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:17 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6840001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:17 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e84 e84: 3 total, 3 up, 3 in
Jan 23 04:53:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:17 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68340016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:17 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.7 scrub starts
Jan 23 04:53:17 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 9.7 scrub ok
Jan 23 04:53:18 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.a scrub starts
Jan 23 04:53:18 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.a scrub ok
Jan 23 04:53:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:18 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68280016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:18 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e85 e85: 3 total, 3 up, 3 in
Jan 23 04:53:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:19 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68480032d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:19 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6840001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:19 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.16 scrub starts
Jan 23 04:53:19 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.16 scrub ok
Jan 23 04:53:19 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Jan 23 04:53:19 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:19 np0005593295 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 04:53:19 np0005593295 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 23 04:53:19 np0005593295 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 04:53:19 np0005593295 ceph-mon[75771]: Deploying daemon keepalived.nfs.cephfs.compute-0.lrsdkc on compute-0
Jan 23 04:53:19 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e86 e86: 3 total, 3 up, 3 in
Jan 23 04:53:19 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 86 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=67/68 n=7 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=86 pruub=12.381774902s) [0] r=-1 lpr=86 pi=[67,86)/1 crt=62'771 mlcod 0'0 active pruub 90.272521973s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:19 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 86 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=67/68 n=7 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=86 pruub=12.381703377s) [0] r=-1 lpr=86 pi=[67,86)/1 crt=62'771 mlcod 0'0 unknown NOTIFY pruub 90.272521973s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:19 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 86 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=66/67 n=4 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=86 pruub=11.234715462s) [0] r=-1 lpr=86 pi=[66,86)/1 crt=58'754 mlcod 0'0 active pruub 89.125755310s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:19 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 86 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=66/67 n=4 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=86 pruub=11.234684944s) [0] r=-1 lpr=86 pi=[66,86)/1 crt=58'754 mlcod 0'0 unknown NOTIFY pruub 89.125755310s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:20 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.8 scrub starts
Jan 23 04:53:20 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.8 scrub ok
Jan 23 04:53:20 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:20 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:21 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:21 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Jan 23 04:53:21 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e87 e87: 3 total, 3 up, 3 in
Jan 23 04:53:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 87 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=67/68 n=7 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=87) [0]/[2] r=0 lpr=87 pi=[67,87)/1 crt=62'771 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 87 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=67/68 n=7 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=87) [0]/[2] r=0 lpr=87 pi=[67,87)/1 crt=62'771 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 87 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=66/67 n=4 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=87) [0]/[2] r=0 lpr=87 pi=[66,87)/1 crt=58'754 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:21 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 87 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=66/67 n=4 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=87) [0]/[2] r=0 lpr=87 pi=[66,87)/1 crt=58'754 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:21 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68480032d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:21 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.13 scrub starts
Jan 23 04:53:21 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.13 scrub ok
Jan 23 04:53:22 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Jan 23 04:53:22 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Jan 23 04:53:22 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e88 e88: 3 total, 3 up, 3 in
Jan 23 04:53:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 88 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=87/88 n=7 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=87) [0]/[2] async=[0] r=0 lpr=87 pi=[67,87)/1 crt=62'771 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:22 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 88 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=87/88 n=4 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=87) [0]/[2] async=[0] r=0 lpr=87 pi=[66,87)/1 crt=58'754 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:22 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.17 scrub starts
Jan 23 04:53:22 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.17 scrub ok
Jan 23 04:53:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:22 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68480032d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:23 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:23 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e89 e89: 3 total, 3 up, 3 in
Jan 23 04:53:23 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 89 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=87/88 n=4 ec=59/46 lis/c=87/66 les/c/f=88/67/0 sis=89 pruub=14.983925819s) [0] async=[0] r=-1 lpr=89 pi=[66,89)/1 crt=58'754 mlcod 58'754 active pruub 96.317939758s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:23 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 89 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=87/88 n=4 ec=59/46 lis/c=87/66 les/c/f=88/67/0 sis=89 pruub=14.983826637s) [0] r=-1 lpr=89 pi=[66,89)/1 crt=58'754 mlcod 0'0 unknown NOTIFY pruub 96.317939758s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:23 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 89 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=87/88 n=7 ec=59/46 lis/c=87/67 les/c/f=88/68/0 sis=89 pruub=14.983630180s) [0] async=[0] r=-1 lpr=89 pi=[67,89)/1 crt=62'771 mlcod 62'771 active pruub 96.317901611s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:23 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 89 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=66/67 n=2 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=89 pruub=15.791469574s) [0] r=-1 lpr=89 pi=[66,89)/1 crt=60'756 mlcod 0'0 active pruub 97.125907898s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:23 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 89 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=66/67 n=2 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=89 pruub=15.791426659s) [0] r=-1 lpr=89 pi=[66,89)/1 crt=60'756 mlcod 0'0 unknown NOTIFY pruub 97.125907898s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:23 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 89 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=87/88 n=7 ec=59/46 lis/c=87/67 les/c/f=88/68/0 sis=89 pruub=14.983448982s) [0] r=-1 lpr=89 pi=[67,89)/1 crt=62'771 mlcod 0'0 unknown NOTIFY pruub 96.317901611s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:23 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 89 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=89 pruub=15.790423393s) [0] r=-1 lpr=89 pi=[66,89)/1 crt=62'759 mlcod 0'0 active pruub 97.126426697s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:23 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 89 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=89 pruub=15.790397644s) [0] r=-1 lpr=89 pi=[66,89)/1 crt=62'759 mlcod 0'0 unknown NOTIFY pruub 97.126426697s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:23 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Jan 23 04:53:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:23 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:23 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.1d scrub starts
Jan 23 04:53:23 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.1d scrub ok
Jan 23 04:53:23 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e90 e90: 3 total, 3 up, 3 in
Jan 23 04:53:23 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 90 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=66/67 n=2 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=90) [0]/[2] r=0 lpr=90 pi=[66,90)/1 crt=60'756 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:23 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 90 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=66/67 n=2 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=90) [0]/[2] r=0 lpr=90 pi=[66,90)/1 crt=60'756 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:23 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 90 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=90) [0]/[2] r=0 lpr=90 pi=[66,90)/1 crt=62'759 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:23 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 90 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=90) [0]/[2] r=0 lpr=90 pi=[66,90)/1 crt=62'759 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:24 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.1e scrub starts
Jan 23 04:53:24 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.1e scrub ok
Jan 23 04:53:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:24 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68480032d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:25 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:25 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Jan 23 04:53:25 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e91 e91: 3 total, 3 up, 3 in
Jan 23 04:53:25 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 91 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=90/91 n=2 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=90) [0]/[2] async=[0] r=0 lpr=90 pi=[66,90)/1 crt=60'756 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:25 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 91 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=90/91 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=90) [0]/[2] async=[0] r=0 lpr=90 pi=[66,90)/1 crt=62'759 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:25 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:25 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.2 scrub starts
Jan 23 04:53:25 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.2 scrub ok
Jan 23 04:53:25 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:26 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.3 scrub starts
Jan 23 04:53:26 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.3 scrub ok
Jan 23 04:53:26 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:26 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:27 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e92 e92: 3 total, 3 up, 3 in
Jan 23 04:53:27 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 92 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=90/91 n=2 ec=59/46 lis/c=90/66 les/c/f=91/67/0 sis=92 pruub=14.362304688s) [0] async=[0] r=-1 lpr=92 pi=[66,92)/1 crt=60'756 mlcod 60'756 active pruub 99.363479614s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:27 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 92 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=90/91 n=2 ec=59/46 lis/c=90/66 les/c/f=91/67/0 sis=92 pruub=14.362131119s) [0] r=-1 lpr=92 pi=[66,92)/1 crt=60'756 mlcod 0'0 unknown NOTIFY pruub 99.363479614s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:27 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 92 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=90/91 n=5 ec=59/46 lis/c=90/66 les/c/f=91/67/0 sis=92 pruub=14.364026070s) [0] async=[0] r=-1 lpr=92 pi=[66,92)/1 crt=62'759 mlcod 62'759 active pruub 99.365730286s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:27 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 92 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=90/91 n=5 ec=59/46 lis/c=90/66 les/c/f=91/67/0 sis=92 pruub=14.363988876s) [0] r=-1 lpr=92 pi=[66,92)/1 crt=62'759 mlcod 0'0 unknown NOTIFY pruub 99.365730286s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:27 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68480047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:27 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:27 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.1a scrub starts
Jan 23 04:53:27 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.1a scrub ok
Jan 23 04:53:28 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.9 scrub starts
Jan 23 04:53:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:28 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:29 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:29 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.9 scrub ok
Jan 23 04:53:29 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.4 scrub starts
Jan 23 04:53:29 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.4 scrub ok
Jan 23 04:53:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:29 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:29 np0005593295 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 04:53:29 np0005593295 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 04:53:29 np0005593295 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 23 04:53:29 np0005593295 ceph-mon[75771]: Deploying daemon keepalived.nfs.cephfs.compute-2.pawaai on compute-2
Jan 23 04:53:29 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e93 e93: 3 total, 3 up, 3 in
Jan 23 04:53:30 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.7 scrub starts
Jan 23 04:53:30 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.7 scrub ok
Jan 23 04:53:30 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:30 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:30 np0005593295 podman[83893]: 2026-01-23 09:53:30.897199248 +0000 UTC m=+4.719883928 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 23 04:53:30 np0005593295 podman[83893]: 2026-01-23 09:53:30.913604303 +0000 UTC m=+4.736288963 container create 26719e2240007fa1f7b99de8994dafe111adcda7d392677253f701311a437943 (image=quay.io/ceph/keepalived:2.2.4, name=elastic_blackburn, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.28.2, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, description=keepalived for Ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, version=2.2.4, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 23 04:53:30 np0005593295 systemd[1]: Started libpod-conmon-26719e2240007fa1f7b99de8994dafe111adcda7d392677253f701311a437943.scope.
Jan 23 04:53:30 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:53:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:31 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:31 np0005593295 podman[83893]: 2026-01-23 09:53:31.232819155 +0000 UTC m=+5.055503815 container init 26719e2240007fa1f7b99de8994dafe111adcda7d392677253f701311a437943 (image=quay.io/ceph/keepalived:2.2.4, name=elastic_blackburn, distribution-scope=public, name=keepalived, io.openshift.expose-services=, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, description=keepalived for Ceph, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, architecture=x86_64, release=1793, vendor=Red Hat, Inc.)
Jan 23 04:53:31 np0005593295 podman[83893]: 2026-01-23 09:53:31.241743674 +0000 UTC m=+5.064428334 container start 26719e2240007fa1f7b99de8994dafe111adcda7d392677253f701311a437943 (image=quay.io/ceph/keepalived:2.2.4, name=elastic_blackburn, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., release=1793, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, io.openshift.expose-services=, io.buildah.version=1.28.2)
Jan 23 04:53:31 np0005593295 podman[83893]: 2026-01-23 09:53:31.246241989 +0000 UTC m=+5.068926669 container attach 26719e2240007fa1f7b99de8994dafe111adcda7d392677253f701311a437943 (image=quay.io/ceph/keepalived:2.2.4, name=elastic_blackburn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., name=keepalived, vcs-type=git, distribution-scope=public, architecture=x86_64, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 23 04:53:31 np0005593295 elastic_blackburn[83991]: 0 0
Jan 23 04:53:31 np0005593295 systemd[1]: libpod-26719e2240007fa1f7b99de8994dafe111adcda7d392677253f701311a437943.scope: Deactivated successfully.
Jan 23 04:53:31 np0005593295 podman[83893]: 2026-01-23 09:53:31.24928636 +0000 UTC m=+5.071971020 container died 26719e2240007fa1f7b99de8994dafe111adcda7d392677253f701311a437943 (image=quay.io/ceph/keepalived:2.2.4, name=elastic_blackburn, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, description=keepalived for Ceph, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, vcs-type=git, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.buildah.version=1.28.2, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20)
Jan 23 04:53:31 np0005593295 systemd[1]: var-lib-containers-storage-overlay-99bd1a1218ddcc9282978193efca963c2b61900d03748cb8bf171d2e03c61cfb-merged.mount: Deactivated successfully.
Jan 23 04:53:31 np0005593295 podman[83893]: 2026-01-23 09:53:31.29662684 +0000 UTC m=+5.119311670 container remove 26719e2240007fa1f7b99de8994dafe111adcda7d392677253f701311a437943 (image=quay.io/ceph/keepalived:2.2.4, name=elastic_blackburn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, name=keepalived, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, com.redhat.component=keepalived-container, release=1793, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 23 04:53:31 np0005593295 systemd[1]: libpod-conmon-26719e2240007fa1f7b99de8994dafe111adcda7d392677253f701311a437943.scope: Deactivated successfully.
Jan 23 04:53:31 np0005593295 systemd[1]: Reloading.
Jan 23 04:53:31 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.19 deep-scrub starts
Jan 23 04:53:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:31 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:31 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.19 deep-scrub ok
Jan 23 04:53:31 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:53:31 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:53:31 np0005593295 systemd[1]: Reloading.
Jan 23 04:53:31 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:53:31 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:53:31 np0005593295 systemd[1]: Starting Ceph keepalived.nfs.cephfs.compute-2.pawaai for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:53:32 np0005593295 podman[84139]: 2026-01-23 09:53:32.140566301 +0000 UTC m=+0.045486847 container create 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, release=1793, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.28.2, version=2.2.4, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc.)
Jan 23 04:53:32 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12f2c7cd8a6b52dfd2865263f41bdb1091228e98f9902071822660490c39b1ea/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:53:32 np0005593295 podman[84139]: 2026-01-23 09:53:32.20748226 +0000 UTC m=+0.112402806 container init 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, io.buildah.version=1.28.2, version=2.2.4, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, release=1793, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, architecture=x86_64, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., distribution-scope=public, name=keepalived, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Jan 23 04:53:32 np0005593295 podman[84139]: 2026-01-23 09:53:32.212909747 +0000 UTC m=+0.117830293 container start 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, vcs-type=git, io.openshift.expose-services=, version=2.2.4, build-date=2023-02-22T09:23:20, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, architecture=x86_64, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., release=1793, name=keepalived)
Jan 23 04:53:32 np0005593295 podman[84139]: 2026-01-23 09:53:32.121522255 +0000 UTC m=+0.026442821 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 23 04:53:32 np0005593295 bash[84139]: 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa
Jan 23 04:53:32 np0005593295 systemd[1]: Started Ceph keepalived.nfs.cephfs.compute-2.pawaai for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:53:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:32 2026: Starting Keepalived v2.2.4 (08/21,2021)
Jan 23 04:53:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:32 2026: Running on Linux 5.14.0-661.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026 (built for Linux 5.14.0)
Jan 23 04:53:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:32 2026: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Jan 23 04:53:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:32 2026: Configuration file /etc/keepalived/keepalived.conf
Jan 23 04:53:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:32 2026: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Jan 23 04:53:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:32 2026: Starting VRRP child process, pid=4
Jan 23 04:53:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:32 2026: Startup complete
Jan 23 04:53:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:32 2026: (VI_0) Entering BACKUP STATE (init)
Jan 23 04:53:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:32 2026: VRRP_Script(check_backend) succeeded
Jan 23 04:53:32 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.17 scrub starts
Jan 23 04:53:32 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.17 scrub ok
Jan 23 04:53:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:32 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:32 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:32 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:32 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:32 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:32 np0005593295 ceph-mon[75771]: Deploying daemon alertmanager.compute-0 on compute-0
Jan 23 04:53:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:33 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:33 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.11 scrub starts
Jan 23 04:53:33 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.11 scrub ok
Jan 23 04:53:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:33 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:34 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.13 scrub starts
Jan 23 04:53:34 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.13 scrub ok
Jan 23 04:53:34 np0005593295 systemd-logind[786]: New session 36 of user zuul.
Jan 23 04:53:34 np0005593295 systemd[1]: Started Session 36 of User zuul.
Jan 23 04:53:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:34 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:35 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:35 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.18 scrub starts
Jan 23 04:53:35 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 12.18 scrub ok
Jan 23 04:53:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:35 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:35 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:35 np0005593295 python3.9[84317]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:53:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:35 2026: (VI_0) Entering MASTER STATE
Jan 23 04:53:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:35 2026: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Jan 23 04:53:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:53:35 2026: (VI_0) Entering BACKUP STATE
Jan 23 04:53:36 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e94 e94: 3 total, 3 up, 3 in
Jan 23 04:53:36 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 94 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=73/74 n=7 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=94 pruub=8.604924202s) [0] r=-1 lpr=94 pi=[73,94)/1 crt=62'764 mlcod 0'0 active pruub 102.616806030s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:36 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 94 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=73/74 n=7 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=94 pruub=8.604826927s) [0] r=-1 lpr=94 pi=[73,94)/1 crt=62'764 mlcod 0'0 unknown NOTIFY pruub 102.616806030s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:36 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 94 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=73/74 n=6 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=94 pruub=8.607663155s) [0] r=-1 lpr=94 pi=[73,94)/1 crt=62'764 mlcod 0'0 active pruub 102.620361328s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:36 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 94 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=73/74 n=6 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=94 pruub=8.607626915s) [0] r=-1 lpr=94 pi=[73,94)/1 crt=62'764 mlcod 0'0 unknown NOTIFY pruub 102.620361328s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:36 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Jan 23 04:53:36 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:36 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.3 scrub starts
Jan 23 04:53:36 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.3 scrub ok
Jan 23 04:53:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:36 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:37 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:37 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.e scrub starts
Jan 23 04:53:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:37 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:37 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 11.e scrub ok
Jan 23 04:53:38 np0005593295 python3.9[84535]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:53:38 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.4 deep-scrub starts
Jan 23 04:53:38 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.4 deep-scrub ok
Jan 23 04:53:38 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e95 e95: 3 total, 3 up, 3 in
Jan 23 04:53:38 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Jan 23 04:53:38 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:38 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 95 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=73/74 n=6 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=95) [0]/[2] r=0 lpr=95 pi=[73,95)/1 crt=62'764 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:38 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 95 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=73/74 n=7 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=95) [0]/[2] r=0 lpr=95 pi=[73,95)/1 crt=62'764 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:38 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 95 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=73/74 n=6 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=95) [0]/[2] r=0 lpr=95 pi=[73,95)/1 crt=62'764 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:38 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 95 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=73/74 n=7 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=95) [0]/[2] r=0 lpr=95 pi=[73,95)/1 crt=62'764 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:38 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:39 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:39 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c002720 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:39 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.1 scrub starts
Jan 23 04:53:39 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 23 04:53:39 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:39 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:39 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:39 np0005593295 ceph-mon[75771]: Regenerating cephadm self-signed grafana TLS certificates
Jan 23 04:53:39 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:39 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:39 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Jan 23 04:53:39 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:39 np0005593295 ceph-mon[75771]: Deploying daemon grafana.compute-0 on compute-0
Jan 23 04:53:39 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 23 04:53:39 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.1 scrub ok
Jan 23 04:53:39 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e96 e96: 3 total, 3 up, 3 in
Jan 23 04:53:39 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 96 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=95/96 n=6 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=95) [0]/[2] async=[0] r=0 lpr=95 pi=[73,95)/1 crt=62'764 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:39 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 96 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=95/96 n=7 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=95) [0]/[2] async=[0] r=0 lpr=95 pi=[73,95)/1 crt=62'764 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:40 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.13 scrub starts
Jan 23 04:53:40 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.13 scrub ok
Jan 23 04:53:40 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:40 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:40 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e97 e97: 3 total, 3 up, 3 in
Jan 23 04:53:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Jan 23 04:53:40 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Jan 23 04:53:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 97 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=95/96 n=7 ec=59/46 lis/c=95/73 les/c/f=96/74/0 sis=97 pruub=14.590447426s) [0] async=[0] r=-1 lpr=97 pi=[73,97)/1 crt=62'764 mlcod 62'764 active pruub 113.512527466s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 97 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=95/96 n=7 ec=59/46 lis/c=95/73 les/c/f=96/74/0 sis=97 pruub=14.590225220s) [0] r=-1 lpr=97 pi=[73,97)/1 crt=62'764 mlcod 0'0 unknown NOTIFY pruub 113.512527466s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 97 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=95/96 n=6 ec=59/46 lis/c=95/73 les/c/f=96/74/0 sis=97 pruub=14.589948654s) [0] async=[0] r=-1 lpr=97 pi=[73,97)/1 crt=62'764 mlcod 62'764 active pruub 113.512481689s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 97 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=95/96 n=6 ec=59/46 lis/c=95/73 les/c/f=96/74/0 sis=97 pruub=14.589887619s) [0] r=-1 lpr=97 pi=[73,97)/1 crt=62'764 mlcod 0'0 unknown NOTIFY pruub 113.512481689s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:41 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:41 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:41 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.11 scrub starts
Jan 23 04:53:41 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.11 scrub ok
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e98 e98: 3 total, 3 up, 3 in
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:53:42.313344) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022313535, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 7502, "num_deletes": 257, "total_data_size": 18548443, "memory_usage": 19292720, "flush_reason": "Manual Compaction"}
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022450002, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 11454841, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 251, "largest_seqno": 7507, "table_properties": {"data_size": 11424610, "index_size": 19300, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9861, "raw_key_size": 96717, "raw_average_key_size": 24, "raw_value_size": 11348944, "raw_average_value_size": 2879, "num_data_blocks": 851, "num_entries": 3941, "num_filter_entries": 3941, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 1769161840, "file_creation_time": 1769162022, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 136751 microseconds, and 62105 cpu microseconds.
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:53:42.450157) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 11454841 bytes OK
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:53:42.450189) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:53:42.454502) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:53:42.454555) EVENT_LOG_v1 {"time_micros": 1769162022454545, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:53:42.454587) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 18506772, prev total WAL file size 18506772, number of live WAL files 2.
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:53:42.458996) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(10MB) 8(1648B)]
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022459184, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 11456489, "oldest_snapshot_seqno": -1}
Jan 23 04:53:42 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.3 scrub starts
Jan 23 04:53:42 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.3 scrub ok
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3687 keys, 11451043 bytes, temperature: kUnknown
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022769767, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 11451043, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11421450, "index_size": 19243, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9285, "raw_key_size": 92435, "raw_average_key_size": 25, "raw_value_size": 11349002, "raw_average_value_size": 3078, "num_data_blocks": 850, "num_entries": 3687, "num_filter_entries": 3687, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769162022, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:53:42.770193) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 11451043 bytes
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:53:42.772602) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 36.9 rd, 36.9 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(10.9, 0.0 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3946, records dropped: 259 output_compression: NoCompression
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:53:42.772623) EVENT_LOG_v1 {"time_micros": 1769162022772613, "job": 4, "event": "compaction_finished", "compaction_time_micros": 310747, "compaction_time_cpu_micros": 226693, "output_level": 6, "num_output_files": 1, "total_output_size": 11451043, "num_input_records": 3946, "num_output_records": 3687, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022774460, "job": 4, "event": "table_file_deletion", "file_number": 14}
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022774519, "job": 4, "event": "table_file_deletion", "file_number": 8}
Jan 23 04:53:42 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:53:42.458750) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:53:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:42 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c002720 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:43 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:43 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Jan 23 04:53:43 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e99 e99: 3 total, 3 up, 3 in
Jan 23 04:53:43 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.14 scrub starts
Jan 23 04:53:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:43 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:43 np0005593295 ceph-osd[81231]: log_channel(cluster) log [DBG] : 10.14 scrub ok
Jan 23 04:53:44 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Jan 23 04:53:44 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e100 e100: 3 total, 3 up, 3 in
Jan 23 04:53:44 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 100 pg[10.f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=100) [2] r=0 lpr=100 pi=[82,100)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:44 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 100 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=100) [2] r=0 lpr=100 pi=[82,100)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:44 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:45 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c002720 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:45 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e101 e101: 3 total, 3 up, 3 in
Jan 23 04:53:45 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 101 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=101) [2]/[0] r=-1 lpr=101 pi=[82,101)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:45 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 101 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=101) [2]/[0] r=-1 lpr=101 pi=[82,101)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:45 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 101 pg[10.f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=101) [2]/[0] r=-1 lpr=101 pi=[82,101)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:45 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 101 pg[10.f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=101) [2]/[0] r=-1 lpr=101 pi=[82,101)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:45 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:45 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Jan 23 04:53:45 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:46 np0005593295 systemd[77796]: Starting Mark boot as successful...
Jan 23 04:53:46 np0005593295 systemd[77796]: Finished Mark boot as successful.
Jan 23 04:53:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:46 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:47 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:47 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e102 e102: 3 total, 3 up, 3 in
Jan 23 04:53:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:47 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:48 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:49 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:49 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:50 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:50 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:51 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:51 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:51 np0005593295 ceph-mds[83039]: mds.beacon.cephfs.compute-2.prgzmm missed beacon ack from the monitors
Jan 23 04:53:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:52 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:53 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:53 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:53 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e103 e103: 3 total, 3 up, 3 in
Jan 23 04:53:53 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 23 04:53:53 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 23 04:53:53 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 23 04:53:53 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 103 pg[10.10( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=103) [2] r=0 lpr=103 pi=[59,103)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:53 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 103 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=0/0 n=5 ec=59/46 lis/c=101/82 les/c/f=102/83/0 sis=103) [2] r=0 lpr=103 pi=[82,103)/1 luod=0'0 crt=62'761 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:53 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 103 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=0/0 n=5 ec=59/46 lis/c=101/82 les/c/f=102/83/0 sis=103) [2] r=0 lpr=103 pi=[82,103)/1 crt=62'761 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:53 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 103 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=101/82 les/c/f=102/83/0 sis=103) [2] r=0 lpr=103 pi=[82,103)/1 luod=0'0 crt=62'771 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:53 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 103 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=101/82 les/c/f=102/83/0 sis=103) [2] r=0 lpr=103 pi=[82,103)/1 crt=62'771 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:54 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e104 e104: 3 total, 3 up, 3 in
Jan 23 04:53:54 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Jan 23 04:53:54 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Jan 23 04:53:54 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Jan 23 04:53:54 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 104 pg[10.10( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=104) [2]/[0] r=-1 lpr=104 pi=[59,104)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:54 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 104 pg[10.10( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=104) [2]/[0] r=-1 lpr=104 pi=[59,104)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:54 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 104 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=103/104 n=5 ec=59/46 lis/c=101/82 les/c/f=102/83/0 sis=103) [2] r=0 lpr=103 pi=[82,103)/1 crt=62'761 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:54 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 104 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=103/104 n=7 ec=59/46 lis/c=101/82 les/c/f=102/83/0 sis=103) [2] r=0 lpr=103 pi=[82,103)/1 crt=62'771 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:54 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:55 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:55 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6834003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:55 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e105 e105: 3 total, 3 up, 3 in
Jan 23 04:53:55 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Jan 23 04:53:55 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e105 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:53:56 np0005593295 systemd[1]: session-36.scope: Deactivated successfully.
Jan 23 04:53:56 np0005593295 systemd[1]: session-36.scope: Consumed 9.093s CPU time.
Jan 23 04:53:56 np0005593295 systemd-logind[786]: Session 36 logged out. Waiting for processes to exit.
Jan 23 04:53:56 np0005593295 systemd-logind[786]: Removed session 36.
Jan 23 04:53:56 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e106 e106: 3 total, 3 up, 3 in
Jan 23 04:53:56 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 106 pg[10.10( v 58'754 (0'0,58'754] local-lis/les=0/0 n=2 ec=59/46 lis/c=104/59 les/c/f=105/60/0 sis=106) [2] r=0 lpr=106 pi=[59,106)/1 luod=0'0 crt=58'754 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:56 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 106 pg[10.10( v 58'754 (0'0,58'754] local-lis/les=0/0 n=2 ec=59/46 lis/c=104/59 les/c/f=105/60/0 sis=106) [2] r=0 lpr=106 pi=[59,106)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:56 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Jan 23 04:53:56 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:56 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:56 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:56 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:56 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:56 np0005593295 ceph-mon[75771]: Deploying daemon haproxy.rgw.default.compute-0.qabsws on compute-0
Jan 23 04:53:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:56 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:57 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:57 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:57 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e107 e107: 3 total, 3 up, 3 in
Jan 23 04:53:57 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 107 pg[10.10( v 58'754 (0'0,58'754] local-lis/les=106/107 n=2 ec=59/46 lis/c=104/59 les/c/f=105/60/0 sis=106) [2] r=0 lpr=106 pi=[59,106)/1 crt=58'754 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:53:57 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Jan 23 04:53:57 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Jan 23 04:53:58 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 107 pg[10.12( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=107) [2] r=0 lpr=107 pi=[67,107)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:53:58 np0005593295 podman[84701]: 2026-01-23 09:53:58.715158748 +0000 UTC m=+0.047490704 container create 924e5abd11de385a333b10a1167f3684d7ebea6c25638e8382d0a50af0ae4414 (image=quay.io/ceph/haproxy:2.3, name=inspiring_carver)
Jan 23 04:53:58 np0005593295 systemd[1]: Started libpod-conmon-924e5abd11de385a333b10a1167f3684d7ebea6c25638e8382d0a50af0ae4414.scope.
Jan 23 04:53:58 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:53:58 np0005593295 podman[84701]: 2026-01-23 09:53:58.696130132 +0000 UTC m=+0.028462108 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 23 04:53:58 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:58 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:58 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:53:58 np0005593295 ceph-mon[75771]: Deploying daemon haproxy.rgw.default.compute-2.izjwnk on compute-2
Jan 23 04:53:58 np0005593295 podman[84701]: 2026-01-23 09:53:58.804553103 +0000 UTC m=+0.136885069 container init 924e5abd11de385a333b10a1167f3684d7ebea6c25638e8382d0a50af0ae4414 (image=quay.io/ceph/haproxy:2.3, name=inspiring_carver)
Jan 23 04:53:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:58 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:58 np0005593295 podman[84701]: 2026-01-23 09:53:58.81381672 +0000 UTC m=+0.146148666 container start 924e5abd11de385a333b10a1167f3684d7ebea6c25638e8382d0a50af0ae4414 (image=quay.io/ceph/haproxy:2.3, name=inspiring_carver)
Jan 23 04:53:58 np0005593295 podman[84701]: 2026-01-23 09:53:58.818558422 +0000 UTC m=+0.150890468 container attach 924e5abd11de385a333b10a1167f3684d7ebea6c25638e8382d0a50af0ae4414 (image=quay.io/ceph/haproxy:2.3, name=inspiring_carver)
Jan 23 04:53:58 np0005593295 inspiring_carver[84717]: 0 0
Jan 23 04:53:58 np0005593295 systemd[1]: libpod-924e5abd11de385a333b10a1167f3684d7ebea6c25638e8382d0a50af0ae4414.scope: Deactivated successfully.
Jan 23 04:53:58 np0005593295 conmon[84717]: conmon 924e5abd11de385a333b <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-924e5abd11de385a333b10a1167f3684d7ebea6c25638e8382d0a50af0ae4414.scope/container/memory.events
Jan 23 04:53:58 np0005593295 podman[84701]: 2026-01-23 09:53:58.821987902 +0000 UTC m=+0.154319848 container died 924e5abd11de385a333b10a1167f3684d7ebea6c25638e8382d0a50af0ae4414 (image=quay.io/ceph/haproxy:2.3, name=inspiring_carver)
Jan 23 04:53:58 np0005593295 systemd[1]: var-lib-containers-storage-overlay-6772238725dedc4f7395fc6b6d506a03e7015afb4a207bd32baa989f5a107db7-merged.mount: Deactivated successfully.
Jan 23 04:53:58 np0005593295 podman[84701]: 2026-01-23 09:53:58.866946675 +0000 UTC m=+0.199278621 container remove 924e5abd11de385a333b10a1167f3684d7ebea6c25638e8382d0a50af0ae4414 (image=quay.io/ceph/haproxy:2.3, name=inspiring_carver)
Jan 23 04:53:58 np0005593295 systemd[1]: libpod-conmon-924e5abd11de385a333b10a1167f3684d7ebea6c25638e8382d0a50af0ae4414.scope: Deactivated successfully.
Jan 23 04:53:58 np0005593295 systemd[1]: Reloading.
Jan 23 04:53:59 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:53:59 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:53:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:59 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6854000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:59 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e108 e108: 3 total, 3 up, 3 in
Jan 23 04:53:59 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 108 pg[10.12( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=108) [2]/[1] r=-1 lpr=108 pi=[67,108)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:59 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 108 pg[10.12( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=108) [2]/[1] r=-1 lpr=108 pi=[67,108)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:59 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 108 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=108 pruub=12.026422501s) [1] r=-1 lpr=108 pi=[66,108)/1 crt=62'759 mlcod 0'0 active pruub 129.126800537s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:53:59 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 108 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=108 pruub=12.026377678s) [1] r=-1 lpr=108 pi=[66,108)/1 crt=62'759 mlcod 0'0 unknown NOTIFY pruub 129.126800537s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:53:59 np0005593295 systemd[1]: Reloading.
Jan 23 04:53:59 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:53:59 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:53:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:53:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.004000091s ======
Jan 23 04:53:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:59.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000091s
Jan 23 04:53:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:53:59 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:53:59 np0005593295 systemd[1]: Starting Ceph haproxy.rgw.default.compute-2.izjwnk for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:53:59 np0005593295 podman[84861]: 2026-01-23 09:53:59.758323588 +0000 UTC m=+0.041665028 container create c0d2fdbe15736053ab1c3c44f6e122e8a046bd96739d6c407e37736b0b1b24d0 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-rgw-default-compute-2-izjwnk)
Jan 23 04:53:59 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73775bdafb1314bd2f9de495c4be70618a2025b8e05554445d1ca65eb394d1b9/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Jan 23 04:53:59 np0005593295 podman[84861]: 2026-01-23 09:53:59.808172416 +0000 UTC m=+0.091513866 container init c0d2fdbe15736053ab1c3c44f6e122e8a046bd96739d6c407e37736b0b1b24d0 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-rgw-default-compute-2-izjwnk)
Jan 23 04:53:59 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Jan 23 04:53:59 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Jan 23 04:53:59 np0005593295 podman[84861]: 2026-01-23 09:53:59.813107832 +0000 UTC m=+0.096449272 container start c0d2fdbe15736053ab1c3c44f6e122e8a046bd96739d6c407e37736b0b1b24d0 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-rgw-default-compute-2-izjwnk)
Jan 23 04:53:59 np0005593295 bash[84861]: c0d2fdbe15736053ab1c3c44f6e122e8a046bd96739d6c407e37736b0b1b24d0
Jan 23 04:53:59 np0005593295 podman[84861]: 2026-01-23 09:53:59.740627913 +0000 UTC m=+0.023969373 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 23 04:53:59 np0005593295 systemd[1]: Started Ceph haproxy.rgw.default.compute-2.izjwnk for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:53:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-rgw-default-compute-2-izjwnk[84876]: [NOTICE] 022/095359 (2) : New worker #1 (4) forked
Jan 23 04:54:00 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e109 e109: 3 total, 3 up, 3 in
Jan 23 04:54:00 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 109 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=109) [1]/[2] r=0 lpr=109 pi=[66,109)/1 crt=62'759 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:54:00 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 109 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=66/67 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=109) [1]/[2] r=0 lpr=109 pi=[66,109)/1 crt=62'759 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:54:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095400 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 04:54:00 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:54:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:54:00 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:00 np0005593295 ceph-mon[75771]: Health check failed: 1 OSD(s) experiencing slow operations in BlueStore (BLUESTORE_SLOW_OP_ALERT)
Jan 23 04:54:00 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:00 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:00 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:00 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:00 np0005593295 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 04:54:00 np0005593295 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 04:54:00 np0005593295 ceph-mon[75771]: Deploying daemon keepalived.rgw.default.compute-0.tytkrd on compute-0
Jan 23 04:54:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:54:01 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f68400034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:01 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e110 e110: 3 total, 3 up, 3 in
Jan 23 04:54:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:01.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:01 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 110 pg[10.12( v 61'760 (0'0,61'760] local-lis/les=0/0 n=4 ec=59/46 lis/c=108/67 les/c/f=109/68/0 sis=110) [2] r=0 lpr=110 pi=[67,110)/1 luod=0'0 crt=61'760 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:54:01 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 110 pg[10.12( v 61'760 (0'0,61'760] local-lis/les=0/0 n=4 ec=59/46 lis/c=108/67 les/c/f=109/68/0 sis=110) [2] r=0 lpr=110 pi=[67,110)/1 crt=61'760 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:54:01 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 110 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=109/110 n=5 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=109) [1]/[2] async=[1] r=0 lpr=109 pi=[66,109)/1 crt=62'759 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:54:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:01.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:54:01 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6854001d70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:01 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:02 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e111 e111: 3 total, 3 up, 3 in
Jan 23 04:54:02 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 111 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=109/110 n=5 ec=59/46 lis/c=109/66 les/c/f=110/67/0 sis=111 pruub=14.992680550s) [1] async=[1] r=-1 lpr=111 pi=[66,111)/1 crt=62'759 mlcod 62'759 active pruub 135.129196167s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:54:02 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 111 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=109/110 n=5 ec=59/46 lis/c=109/66 les/c/f=110/67/0 sis=111 pruub=14.992507935s) [1] r=-1 lpr=111 pi=[66,111)/1 crt=62'759 mlcod 0'0 unknown NOTIFY pruub 135.129196167s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:54:02 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 111 pg[10.12( v 61'760 (0'0,61'760] local-lis/les=110/111 n=4 ec=59/46 lis/c=108/67 les/c/f=109/68/0 sis=110) [2] r=0 lpr=110 pi=[67,110)/1 crt=61'760 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:54:02 np0005593295 podman[84982]: 2026-01-23 09:54:02.60883274 +0000 UTC m=+0.040942471 container create 3f5566e36fa0d56d11b5a23c744c2017dc2977a64b2ba140c154bddcb5386e8e (image=quay.io/ceph/keepalived:2.2.4, name=gallant_lamport, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, name=keepalived, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, description=keepalived for Ceph, distribution-scope=public, vcs-type=git, release=1793)
Jan 23 04:54:02 np0005593295 systemd[1]: Started libpod-conmon-3f5566e36fa0d56d11b5a23c744c2017dc2977a64b2ba140c154bddcb5386e8e.scope.
Jan 23 04:54:02 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:54:02 np0005593295 podman[84982]: 2026-01-23 09:54:02.590273225 +0000 UTC m=+0.022382976 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 23 04:54:02 np0005593295 podman[84982]: 2026-01-23 09:54:02.699053084 +0000 UTC m=+0.131162835 container init 3f5566e36fa0d56d11b5a23c744c2017dc2977a64b2ba140c154bddcb5386e8e (image=quay.io/ceph/keepalived:2.2.4, name=gallant_lamport, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, name=keepalived, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793)
Jan 23 04:54:02 np0005593295 podman[84982]: 2026-01-23 09:54:02.706161651 +0000 UTC m=+0.138271382 container start 3f5566e36fa0d56d11b5a23c744c2017dc2977a64b2ba140c154bddcb5386e8e (image=quay.io/ceph/keepalived:2.2.4, name=gallant_lamport, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, version=2.2.4, io.openshift.tags=Ceph keepalived, io.openshift.expose-services=, name=keepalived, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, description=keepalived for Ceph)
Jan 23 04:54:02 np0005593295 podman[84982]: 2026-01-23 09:54:02.70996336 +0000 UTC m=+0.142073121 container attach 3f5566e36fa0d56d11b5a23c744c2017dc2977a64b2ba140c154bddcb5386e8e (image=quay.io/ceph/keepalived:2.2.4, name=gallant_lamport, io.buildah.version=1.28.2, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, architecture=x86_64, vcs-type=git, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, vendor=Red Hat, Inc.)
Jan 23 04:54:02 np0005593295 gallant_lamport[84998]: 0 0
Jan 23 04:54:02 np0005593295 systemd[1]: libpod-3f5566e36fa0d56d11b5a23c744c2017dc2977a64b2ba140c154bddcb5386e8e.scope: Deactivated successfully.
Jan 23 04:54:02 np0005593295 conmon[84998]: conmon 3f5566e36fa0d56d11b5 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-3f5566e36fa0d56d11b5a23c744c2017dc2977a64b2ba140c154bddcb5386e8e.scope/container/memory.events
Jan 23 04:54:02 np0005593295 podman[84982]: 2026-01-23 09:54:02.7154741 +0000 UTC m=+0.147583821 container died 3f5566e36fa0d56d11b5a23c744c2017dc2977a64b2ba140c154bddcb5386e8e (image=quay.io/ceph/keepalived:2.2.4, name=gallant_lamport, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, vcs-type=git, release=1793, version=2.2.4, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 23 04:54:02 np0005593295 systemd[1]: var-lib-containers-storage-overlay-98917a16612cec2a1fa02256d3a7e0a6490ab36aade52f28bc04c21f202e332b-merged.mount: Deactivated successfully.
Jan 23 04:54:02 np0005593295 podman[84982]: 2026-01-23 09:54:02.75983384 +0000 UTC m=+0.191943571 container remove 3f5566e36fa0d56d11b5a23c744c2017dc2977a64b2ba140c154bddcb5386e8e (image=quay.io/ceph/keepalived:2.2.4, name=gallant_lamport, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived)
Jan 23 04:54:02 np0005593295 systemd[1]: libpod-conmon-3f5566e36fa0d56d11b5a23c744c2017dc2977a64b2ba140c154bddcb5386e8e.scope: Deactivated successfully.
Jan 23 04:54:02 np0005593295 systemd[1]: Reloading.
Jan 23 04:54:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:54:02 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f682c004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:02 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:54:02 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:54:03 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:03 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:03 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:03 np0005593295 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 04:54:03 np0005593295 ceph-mon[75771]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 04:54:03 np0005593295 ceph-mon[75771]: Deploying daemon keepalived.rgw.default.compute-2.qpmsjd on compute-2
Jan 23 04:54:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[83366]: 23/01/2026 09:54:03 : epoch 697344f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6828003c10 fd 39 proxy ignored for local
Jan 23 04:54:03 np0005593295 kernel: ganesha.nfsd[83423]: segfault at 50 ip 00007f68d4c9f32e sp 00007f683dffa210 error 4 in libntirpc.so.5.8[7f68d4c84000+2c000] likely on CPU 0 (core 0, socket 0)
Jan 23 04:54:03 np0005593295 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 04:54:03 np0005593295 systemd[1]: Created slice Slice /system/systemd-coredump.
Jan 23 04:54:03 np0005593295 systemd[1]: Started Process Core Dump (PID 85055/UID 0).
Jan 23 04:54:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 04:54:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:03.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 04:54:03 np0005593295 systemd[1]: Reloading.
Jan 23 04:54:03 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e112 e112: 3 total, 3 up, 3 in
Jan 23 04:54:03 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:54:03 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:54:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:03.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:03 np0005593295 systemd[1]: Starting Ceph keepalived.rgw.default.compute-2.qpmsjd for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:54:03 np0005593295 podman[85150]: 2026-01-23 09:54:03.72194761 +0000 UTC m=+0.052456781 container create 4079f34468022a0cd827d5befc6f77a5139d567959582991ace2a2e3465960c7 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd, name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, release=1793, io.buildah.version=1.28.2, distribution-scope=public, io.openshift.expose-services=, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Jan 23 04:54:03 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c3c97516a3697cea5613434af93e4104ed25eb6e4dd09966eed88fc66d25fc3/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:54:03 np0005593295 podman[85150]: 2026-01-23 09:54:03.784577937 +0000 UTC m=+0.115087108 container init 4079f34468022a0cd827d5befc6f77a5139d567959582991ace2a2e3465960c7 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd, release=1793, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, name=keepalived, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, vcs-type=git, vendor=Red Hat, Inc.)
Jan 23 04:54:03 np0005593295 podman[85150]: 2026-01-23 09:54:03.698641043 +0000 UTC m=+0.029150234 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 23 04:54:03 np0005593295 podman[85150]: 2026-01-23 09:54:03.807072525 +0000 UTC m=+0.137581716 container start 4079f34468022a0cd827d5befc6f77a5139d567959582991ace2a2e3465960c7 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, release=1793, distribution-scope=public, name=keepalived, vcs-type=git, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, io.buildah.version=1.28.2, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Jan 23 04:54:03 np0005593295 bash[85150]: 4079f34468022a0cd827d5befc6f77a5139d567959582991ace2a2e3465960c7
Jan 23 04:54:03 np0005593295 systemd[1]: Started Ceph keepalived.rgw.default.compute-2.qpmsjd for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:54:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:03 2026: Starting Keepalived v2.2.4 (08/21,2021)
Jan 23 04:54:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:03 2026: Running on Linux 5.14.0-661.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026 (built for Linux 5.14.0)
Jan 23 04:54:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:03 2026: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Jan 23 04:54:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:03 2026: Configuration file /etc/keepalived/keepalived.conf
Jan 23 04:54:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:03 2026: Failed to bind to process monitoring socket - errno 98 - Address already in use
Jan 23 04:54:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:03 2026: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Jan 23 04:54:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:03 2026: Starting VRRP child process, pid=4
Jan 23 04:54:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:03 2026: (VI_0) Entering BACKUP STATE (init)
Jan 23 04:54:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:03 2026: Startup complete
Jan 23 04:54:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:03 2026: VRRP_Script(check_backend) succeeded
Jan 23 04:54:04 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:04 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:04 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:04 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:04 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 04:54:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:05.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 04:54:05 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e113 e113: 3 total, 3 up, 3 in
Jan 23 04:54:05 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 113 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=73/74 n=5 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=113 pruub=11.421886444s) [1] r=-1 lpr=113 pi=[73,113)/1 crt=62'771 mlcod 0'0 active pruub 134.621353149s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:54:05 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 113 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=73/74 n=5 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=113 pruub=11.421828270s) [1] r=-1 lpr=113 pi=[73,113)/1 crt=62'771 mlcod 0'0 unknown NOTIFY pruub 134.621353149s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:54:05 np0005593295 ceph-mon[75771]: Deploying daemon prometheus.compute-0 on compute-0
Jan 23 04:54:05 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Jan 23 04:54:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:05.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:05 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:05 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:05 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e113 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:54:06 np0005593295 systemd-coredump[85058]: Process 83370 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 53:#012#0  0x00007f68d4c9f32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 23 04:54:06 np0005593295 systemd[1]: systemd-coredump@0-85055-0.service: Deactivated successfully.
Jan 23 04:54:06 np0005593295 systemd[1]: systemd-coredump@0-85055-0.service: Consumed 2.925s CPU time.
Jan 23 04:54:06 np0005593295 podman[85178]: 2026-01-23 09:54:06.234439932 +0000 UTC m=+0.031010408 container died 25dd11a4fcdbe97d844db7d4fb971576b159944b07e8d21ef2be7d36d99ebd7d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Jan 23 04:54:06 np0005593295 systemd[1]: var-lib-containers-storage-overlay-2d0f8416f7052c607630e33d06ff2a3ec2436d092e8cfcdd013926939d221c79-merged.mount: Deactivated successfully.
Jan 23 04:54:06 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Jan 23 04:54:06 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:06 np0005593295 podman[85178]: 2026-01-23 09:54:06.278073787 +0000 UTC m=+0.074644243 container remove 25dd11a4fcdbe97d844db7d4fb971576b159944b07e8d21ef2be7d36d99ebd7d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:54:06 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e114 e114: 3 total, 3 up, 3 in
Jan 23 04:54:06 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 04:54:06 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 114 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=73/74 n=5 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=114) [1]/[2] r=0 lpr=114 pi=[73,114)/1 crt=62'771 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:54:06 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 114 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=73/74 n=5 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=114) [1]/[2] r=0 lpr=114 pi=[73,114)/1 crt=62'771 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:54:06 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 04:54:06 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.236s CPU time.
Jan 23 04:54:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:06 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:06 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 04:54:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:07.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 04:54:07 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Jan 23 04:54:07 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e115 e115: 3 total, 3 up, 3 in
Jan 23 04:54:07 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 115 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=114/115 n=5 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=114) [1]/[2] async=[1] r=0 lpr=114 pi=[73,114)/1 crt=62'771 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:54:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:07.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:07 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:07 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:08 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:08 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:08 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Jan 23 04:54:08 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e116 e116: 3 total, 3 up, 3 in
Jan 23 04:54:08 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 116 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=114/115 n=5 ec=59/46 lis/c=114/73 les/c/f=115/74/0 sis=116 pruub=14.343958855s) [1] async=[1] r=-1 lpr=116 pi=[73,116)/1 crt=62'771 mlcod 62'771 active pruub 141.281097412s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:54:08 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 116 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=114/115 n=5 ec=59/46 lis/c=114/73 les/c/f=115/74/0 sis=116 pruub=14.343770981s) [1] r=-1 lpr=116 pi=[73,116)/1 crt=62'771 mlcod 0'0 unknown NOTIFY pruub 141.281097412s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:54:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 04:54:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:09.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 04:54:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:09.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:09 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:09 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:10 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e117 e117: 3 total, 3 up, 3 in
Jan 23 04:54:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:10 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:10 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:10 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e117 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:54:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095411 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 04:54:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:11.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:11.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:11 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:11 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:12 np0005593295 systemd-logind[786]: New session 37 of user zuul.
Jan 23 04:54:12 np0005593295 systemd[1]: Started Session 37 of User zuul.
Jan 23 04:54:12 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:12 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:12 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:12 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch
Jan 23 04:54:12 np0005593295 ceph-mgr[76120]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 23 04:54:12 np0005593295 ceph-mgr[76120]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 23 04:54:12 np0005593295 ceph-mgr[76120]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 23 04:54:12 np0005593295 ceph-mgr[76120]: mgr respawn  1: '-n'
Jan 23 04:54:12 np0005593295 ceph-mgr[76120]: mgr respawn  2: 'mgr.compute-2.uczrot'
Jan 23 04:54:12 np0005593295 ceph-mgr[76120]: mgr respawn  3: '-f'
Jan 23 04:54:12 np0005593295 ceph-mgr[76120]: mgr respawn  4: '--setuser'
Jan 23 04:54:12 np0005593295 ceph-mgr[76120]: mgr respawn  5: 'ceph'
Jan 23 04:54:12 np0005593295 ceph-mgr[76120]: mgr respawn  6: '--setgroup'
Jan 23 04:54:12 np0005593295 ceph-mgr[76120]: mgr respawn  7: 'ceph'
Jan 23 04:54:12 np0005593295 ceph-mgr[76120]: mgr respawn  8: '--default-log-to-file=false'
Jan 23 04:54:12 np0005593295 ceph-mgr[76120]: mgr respawn  9: '--default-log-to-journald=true'
Jan 23 04:54:12 np0005593295 ceph-mgr[76120]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 23 04:54:12 np0005593295 ceph-mgr[76120]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Jan 23 04:54:12 np0005593295 ceph-mgr[76120]: mgr respawn  exe_path /proc/self/exe
Jan 23 04:54:12 np0005593295 systemd[1]: session-34.scope: Deactivated successfully.
Jan 23 04:54:12 np0005593295 systemd[1]: session-34.scope: Consumed 30.610s CPU time.
Jan 23 04:54:12 np0005593295 systemd-logind[786]: Session 34 logged out. Waiting for processes to exit.
Jan 23 04:54:12 np0005593295 systemd-logind[786]: Removed session 34.
Jan 23 04:54:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: ignoring --setuser ceph since I am not root
Jan 23 04:54:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: ignoring --setgroup ceph since I am not root
Jan 23 04:54:12 np0005593295 ceph-mgr[76120]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 23 04:54:12 np0005593295 ceph-mgr[76120]: pidfile_write: ignore empty --pid-file
Jan 23 04:54:12 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'alerts'
Jan 23 04:54:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:12.553+0000 7f4b4bc8d140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:54:12 np0005593295 ceph-mgr[76120]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:54:12 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'balancer'
Jan 23 04:54:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:12 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:12 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:12.669+0000 7f4b4bc8d140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:54:12 np0005593295 ceph-mgr[76120]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:54:12 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'cephadm'
Jan 23 04:54:12 np0005593295 python3.9[85401]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 23 04:54:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:13.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:13.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:13 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'crash'
Jan 23 04:54:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:13 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:13 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:13.663+0000 7f4b4bc8d140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:54:13 np0005593295 ceph-mgr[76120]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:54:13 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'dashboard'
Jan 23 04:54:14 np0005593295 python3.9[85586]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:54:14 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'devicehealth'
Jan 23 04:54:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:14.490+0000 7f4b4bc8d140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:54:14 np0005593295 ceph-mgr[76120]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:54:14 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'diskprediction_local'
Jan 23 04:54:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:14 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:14 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 23 04:54:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 23 04:54:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]:  from numpy import show_config as show_numpy_config
Jan 23 04:54:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:14.722+0000 7f4b4bc8d140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:54:14 np0005593295 ceph-mgr[76120]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:54:14 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'influx'
Jan 23 04:54:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:14.802+0000 7f4b4bc8d140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:54:14 np0005593295 ceph-mgr[76120]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:54:14 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'insights'
Jan 23 04:54:14 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'iostat'
Jan 23 04:54:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:14.951+0000 7f4b4bc8d140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:54:14 np0005593295 ceph-mgr[76120]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:54:14 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'k8sevents'
Jan 23 04:54:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 04:54:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:15.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 04:54:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:15.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:15 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'localpool'
Jan 23 04:54:15 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'mds_autoscaler'
Jan 23 04:54:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:15 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:15 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:15 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e117 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:54:15 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'mirroring'
Jan 23 04:54:15 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'nfs'
Jan 23 04:54:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:16.156+0000 7f4b4bc8d140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:54:16 np0005593295 ceph-mgr[76120]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:54:16 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'orchestrator'
Jan 23 04:54:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:16.427+0000 7f4b4bc8d140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:54:16 np0005593295 ceph-mgr[76120]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:54:16 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'osd_perf_query'
Jan 23 04:54:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:16.514+0000 7f4b4bc8d140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:54:16 np0005593295 ceph-mgr[76120]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:54:16 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'osd_support'
Jan 23 04:54:16 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 1.
Jan 23 04:54:16 np0005593295 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:54:16 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.236s CPU time.
Jan 23 04:54:16 np0005593295 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.590021) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056590296, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 1003, "num_deletes": 251, "total_data_size": 2240482, "memory_usage": 2276192, "flush_reason": "Manual Compaction"}
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056603909, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 1428848, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7512, "largest_seqno": 8510, "table_properties": {"data_size": 1424071, "index_size": 2301, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10217, "raw_average_key_size": 18, "raw_value_size": 1414138, "raw_average_value_size": 2566, "num_data_blocks": 102, "num_entries": 551, "num_filter_entries": 551, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162023, "oldest_key_time": 1769162023, "file_creation_time": 1769162056, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 13925 microseconds, and 7524 cpu microseconds.
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.604024) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 1428848 bytes OK
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.604066) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.606036) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.606065) EVENT_LOG_v1 {"time_micros": 1769162056606060, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.606092) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 2235274, prev total WAL file size 2235274, number of live WAL files 2.
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.607893) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323532' seq:0, type:0; will stop at (end)
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(1395KB)], [15(10MB)]
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056608130, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 12879891, "oldest_snapshot_seqno": -1}
Jan 23 04:54:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:16.610+0000 7f4b4bc8d140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:54:16 np0005593295 ceph-mgr[76120]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:54:16 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'pg_autoscaler'
Jan 23 04:54:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:16 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:16 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:16.715+0000 7f4b4bc8d140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:54:16 np0005593295 ceph-mgr[76120]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:54:16 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'progress'
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3711 keys, 12440754 bytes, temperature: kUnknown
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056721302, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 12440754, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12410095, "index_size": 20309, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9285, "raw_key_size": 94976, "raw_average_key_size": 25, "raw_value_size": 12336134, "raw_average_value_size": 3324, "num_data_blocks": 879, "num_entries": 3711, "num_filter_entries": 3711, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769162056, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.721590) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 12440754 bytes
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.723256) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 113.7 rd, 109.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 10.9 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(17.7) write-amplify(8.7) OK, records in: 4238, records dropped: 527 output_compression: NoCompression
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.723277) EVENT_LOG_v1 {"time_micros": 1769162056723267, "job": 6, "event": "compaction_finished", "compaction_time_micros": 113237, "compaction_time_cpu_micros": 39837, "output_level": 6, "num_output_files": 1, "total_output_size": 12440754, "num_input_records": 4238, "num_output_records": 3711, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056723560, "job": 6, "event": "table_file_deletion", "file_number": 17}
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056725314, "job": 6, "event": "table_file_deletion", "file_number": 15}
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.607012) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.725685) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.725691) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.725693) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.725694) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:54:16 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:54:16.725696) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:54:16 np0005593295 podman[85715]: 2026-01-23 09:54:16.802754113 +0000 UTC m=+0.043612065 container create 2988332eba883a20d1059e3b3728d769fb16998f721e57e41af0869ab3b7663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2)
Jan 23 04:54:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:16.814+0000 7f4b4bc8d140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:54:16 np0005593295 ceph-mgr[76120]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:54:16 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'prometheus'
Jan 23 04:54:16 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a90f8a234e18b05e243a2b45741ab580a2f24f36b6337b5ad8040626fe6cbe4d/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 04:54:16 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a90f8a234e18b05e243a2b45741ab580a2f24f36b6337b5ad8040626fe6cbe4d/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:54:16 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a90f8a234e18b05e243a2b45741ab580a2f24f36b6337b5ad8040626fe6cbe4d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:54:16 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a90f8a234e18b05e243a2b45741ab580a2f24f36b6337b5ad8040626fe6cbe4d/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:54:16 np0005593295 podman[85715]: 2026-01-23 09:54:16.871433248 +0000 UTC m=+0.112291220 container init 2988332eba883a20d1059e3b3728d769fb16998f721e57e41af0869ab3b7663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:54:16 np0005593295 podman[85715]: 2026-01-23 09:54:16.876567866 +0000 UTC m=+0.117425818 container start 2988332eba883a20d1059e3b3728d769fb16998f721e57e41af0869ab3b7663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:54:16 np0005593295 podman[85715]: 2026-01-23 09:54:16.781342555 +0000 UTC m=+0.022200527 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:54:16 np0005593295 bash[85715]: 2988332eba883a20d1059e3b3728d769fb16998f721e57e41af0869ab3b7663e
Jan 23 04:54:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:16 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 04:54:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:16 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 04:54:16 np0005593295 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:54:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:16 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 04:54:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:16 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 04:54:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:16 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 04:54:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:16 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 04:54:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:16 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 04:54:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:16 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:54:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:17.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:17.284+0000 7f4b4bc8d140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:54:17 np0005593295 ceph-mgr[76120]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:54:17 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'rbd_support'
Jan 23 04:54:17 np0005593295 python3.9[85847]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:54:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:17.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:17.426+0000 7f4b4bc8d140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:54:17 np0005593295 ceph-mgr[76120]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:54:17 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'restful'
Jan 23 04:54:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:17 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:17 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:17 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'rgw'
Jan 23 04:54:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:17.965+0000 7f4b4bc8d140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:54:17 np0005593295 ceph-mgr[76120]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:54:17 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'rook'
Jan 23 04:54:18 np0005593295 python3.9[86002]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:54:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:18 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:18 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:18.730+0000 7f4b4bc8d140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:54:18 np0005593295 ceph-mgr[76120]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:54:18 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'selftest'
Jan 23 04:54:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:18.815+0000 7f4b4bc8d140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:54:18 np0005593295 ceph-mgr[76120]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:54:18 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'snap_schedule'
Jan 23 04:54:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:18.935+0000 7f4b4bc8d140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:54:18 np0005593295 ceph-mgr[76120]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:54:18 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'stats'
Jan 23 04:54:19 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'status'
Jan 23 04:54:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:19.126+0000 7f4b4bc8d140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:54:19 np0005593295 ceph-mgr[76120]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:54:19 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'telegraf'
Jan 23 04:54:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:19.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:19.209+0000 7f4b4bc8d140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:54:19 np0005593295 ceph-mgr[76120]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:54:19 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'telemetry'
Jan 23 04:54:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:19.396+0000 7f4b4bc8d140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:54:19 np0005593295 ceph-mgr[76120]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:54:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:19 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'test_orchestrator'
Jan 23 04:54:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 04:54:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:19.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 04:54:19 np0005593295 python3.9[86156]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:54:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:19 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:19 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:19.669+0000 7f4b4bc8d140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:54:19 np0005593295 ceph-mgr[76120]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:54:19 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'volumes'
Jan 23 04:54:19 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e118 e118: 3 total, 3 up, 3 in
Jan 23 04:54:19 np0005593295 ceph-mon[75771]: Active manager daemon compute-0.nbdygh restarted
Jan 23 04:54:19 np0005593295 ceph-mon[75771]: Activating manager daemon compute-0.nbdygh
Jan 23 04:54:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:20.020+0000 7f4b4bc8d140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:54:20 np0005593295 ceph-mgr[76120]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:54:20 np0005593295 ceph-mgr[76120]: mgr[py] Loading python module 'zabbix'
Jan 23 04:54:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 2026-01-23T09:54:20.111+0000 7f4b4bc8d140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:54:20 np0005593295 ceph-mgr[76120]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:54:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: [23/Jan/2026:09:54:20] ENGINE Bus STARTING
Jan 23 04:54:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: CherryPy Checker:
Jan 23 04:54:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: The Application mounted at '' has an empty config.
Jan 23 04:54:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: 
Jan 23 04:54:20 np0005593295 ceph-mgr[76120]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 23 04:54:20 np0005593295 ceph-mgr[76120]: mgr load Constructed class from module: dashboard
Jan 23 04:54:20 np0005593295 ceph-mgr[76120]: [prometheus DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 23 04:54:20 np0005593295 ceph-mgr[76120]: mgr load Constructed class from module: prometheus
Jan 23 04:54:20 np0005593295 ceph-mgr[76120]: [prometheus INFO root] server_addr: :: server_port: 9283
Jan 23 04:54:20 np0005593295 ceph-mgr[76120]: [prometheus INFO root] Starting engine...
Jan 23 04:54:20 np0005593295 ceph-mgr[76120]: [dashboard INFO root] server: ssl=no host=:: port=8443
Jan 23 04:54:20 np0005593295 ceph-mgr[76120]: ms_deliver_dispatch: unhandled message 0x55f67dd3d860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Jan 23 04:54:20 np0005593295 ceph-mgr[76120]: [prometheus INFO cherrypy.error] [23/Jan/2026:09:54:20] ENGINE Bus STARTING
Jan 23 04:54:20 np0005593295 ceph-mgr[76120]: [dashboard INFO root] Configured CherryPy, starting engine...
Jan 23 04:54:20 np0005593295 ceph-mgr[76120]: [dashboard INFO root] Starting engine...
Jan 23 04:54:20 np0005593295 ceph-mgr[76120]: [dashboard INFO root] Engine started...
Jan 23 04:54:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: [23/Jan/2026:09:54:20] ENGINE Serving on http://:::9283
Jan 23 04:54:20 np0005593295 python3.9[86309]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:54:20 np0005593295 ceph-mgr[76120]: [prometheus INFO cherrypy.error] [23/Jan/2026:09:54:20] ENGINE Serving on http://:::9283
Jan 23 04:54:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-2-uczrot[76116]: [23/Jan/2026:09:54:20] ENGINE Bus STARTED
Jan 23 04:54:20 np0005593295 ceph-mgr[76120]: [prometheus INFO cherrypy.error] [23/Jan/2026:09:54:20] ENGINE Bus STARTED
Jan 23 04:54:20 np0005593295 ceph-mgr[76120]: [prometheus INFO root] Engine started.
Jan 23 04:54:20 np0005593295 systemd-logind[786]: New session 38 of user ceph-admin.
Jan 23 04:54:20 np0005593295 systemd[1]: Started Session 38 of User ceph-admin.
Jan 23 04:54:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:20 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:20 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:20 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e118 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:54:20 np0005593295 ceph-mon[75771]: Manager daemon compute-0.nbdygh is now available
Jan 23 04:54:20 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:20 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:20 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/mirror_snapshot_schedule"}]: dispatch
Jan 23 04:54:20 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/trash_purge_schedule"}]: dispatch
Jan 23 04:54:21 np0005593295 python3.9[86575]: ansible-ansible.builtin.service_facts Invoked
Jan 23 04:54:21 np0005593295 podman[86606]: 2026-01-23 09:54:21.171609132 +0000 UTC m=+0.068906523 container exec 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:54:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 04:54:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:21.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 04:54:21 np0005593295 network[86643]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:54:21 np0005593295 network[86644]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:54:21 np0005593295 network[86645]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:54:21 np0005593295 podman[86606]: 2026-01-23 09:54:21.310197912 +0000 UTC m=+0.207495303 container exec_died 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:54:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:21.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:21 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:21 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095421 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 04:54:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [NOTICE] 022/095421 (4) : haproxy version is 2.3.17-d1c9119
Jan 23 04:54:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [NOTICE] 022/095421 (4) : path to executable is /usr/local/sbin/haproxy
Jan 23 04:54:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [ALERT] 022/095421 (4) : backend 'backend' has no server available!
Jan 23 04:54:22 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e119 e119: 3 total, 3 up, 3 in
Jan 23 04:54:22 np0005593295 ceph-mon[75771]: [23/Jan/2026:09:54:21] ENGINE Bus STARTING
Jan 23 04:54:22 np0005593295 ceph-mon[75771]: [23/Jan/2026:09:54:21] ENGINE Serving on http://192.168.122.100:8765
Jan 23 04:54:22 np0005593295 ceph-mon[75771]: [23/Jan/2026:09:54:21] ENGINE Serving on https://192.168.122.100:7150
Jan 23 04:54:22 np0005593295 ceph-mon[75771]: [23/Jan/2026:09:54:21] ENGINE Bus STARTED
Jan 23 04:54:22 np0005593295 ceph-mon[75771]: [23/Jan/2026:09:54:21] ENGINE Client ('192.168.122.100', 52034) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Jan 23 04:54:22 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Jan 23 04:54:22 np0005593295 podman[86744]: 2026-01-23 09:54:22.433648593 +0000 UTC m=+0.188800595 container exec 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 04:54:22 np0005593295 podman[86744]: 2026-01-23 09:54:22.449241818 +0000 UTC m=+0.204393790 container exec_died 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 04:54:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:22 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:22 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:22 np0005593295 podman[86884]: 2026-01-23 09:54:22.919283244 +0000 UTC m=+0.062377393 container exec 2988332eba883a20d1059e3b3728d769fb16998f721e57e41af0869ab3b7663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:54:22 np0005593295 podman[86884]: 2026-01-23 09:54:22.927761498 +0000 UTC m=+0.070855627 container exec_died 2988332eba883a20d1059e3b3728d769fb16998f721e57e41af0869ab3b7663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:54:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Jan 23 04:54:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Jan 23 04:54:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:54:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:54:23 np0005593295 podman[86961]: 2026-01-23 09:54:23.156045962 +0000 UTC m=+0.055562928 container exec c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 04:54:23 np0005593295 podman[86961]: 2026-01-23 09:54:23.17130775 +0000 UTC m=+0.070824726 container exec_died c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 04:54:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:23.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:23.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:23 np0005593295 podman[87040]: 2026-01-23 09:54:23.452055412 +0000 UTC m=+0.104688969 container exec 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, release=1793, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived)
Jan 23 04:54:23 np0005593295 podman[87040]: 2026-01-23 09:54:23.468142279 +0000 UTC m=+0.120775866 container exec_died 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, description=keepalived for Ceph, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, io.buildah.version=1.28.2, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container)
Jan 23 04:54:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 04:54:23 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Jan 23 04:54:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:23 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:23 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:54:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:54:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:54:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:24 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:24 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:24 np0005593295 python3.9[87416]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:54:24 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:24 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:24 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:24 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:24 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Jan 23 04:54:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 04:54:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:25.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 04:54:25 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e120 e120: 3 total, 3 up, 3 in
Jan 23 04:54:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:25.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:25 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:25 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:25 np0005593295 python3.9[87586]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:54:25 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e120 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:26 np0005593295 podman[87705]: 2026-01-23 09:54:26.155658955 +0000 UTC m=+0.049354126 container create c60165b7bc290c13207dc0c2554021ab5cbc6270c9c0ab3d0b1db948d94bd07b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_swartz, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:54:26 np0005593295 systemd[1]: Started libpod-conmon-c60165b7bc290c13207dc0c2554021ab5cbc6270c9c0ab3d0b1db948d94bd07b.scope.
Jan 23 04:54:26 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:54:26 np0005593295 podman[87705]: 2026-01-23 09:54:26.134014612 +0000 UTC m=+0.027709813 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:54:26 np0005593295 podman[87705]: 2026-01-23 09:54:26.237211034 +0000 UTC m=+0.130906235 container init c60165b7bc290c13207dc0c2554021ab5cbc6270c9c0ab3d0b1db948d94bd07b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_swartz, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:54:26 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Jan 23 04:54:26 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:26 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:26 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:26 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:26 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 04:54:26 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:26 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:26 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Jan 23 04:54:26 np0005593295 podman[87705]: 2026-01-23 09:54:26.24886941 +0000 UTC m=+0.142564591 container start c60165b7bc290c13207dc0c2554021ab5cbc6270c9c0ab3d0b1db948d94bd07b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_swartz, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True)
Jan 23 04:54:26 np0005593295 podman[87705]: 2026-01-23 09:54:26.253659229 +0000 UTC m=+0.147354540 container attach c60165b7bc290c13207dc0c2554021ab5cbc6270c9c0ab3d0b1db948d94bd07b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_swartz, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Jan 23 04:54:26 np0005593295 brave_swartz[87721]: 167 167
Jan 23 04:54:26 np0005593295 systemd[1]: libpod-c60165b7bc290c13207dc0c2554021ab5cbc6270c9c0ab3d0b1db948d94bd07b.scope: Deactivated successfully.
Jan 23 04:54:26 np0005593295 podman[87705]: 2026-01-23 09:54:26.258896849 +0000 UTC m=+0.152592030 container died c60165b7bc290c13207dc0c2554021ab5cbc6270c9c0ab3d0b1db948d94bd07b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_swartz, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:54:26 np0005593295 systemd[1]: var-lib-containers-storage-overlay-2d9b606e8a127c8f73b9b0b674ec1b55e1e503f2327fddc498db858558e8ada5-merged.mount: Deactivated successfully.
Jan 23 04:54:26 np0005593295 podman[87705]: 2026-01-23 09:54:26.303814833 +0000 UTC m=+0.197510004 container remove c60165b7bc290c13207dc0c2554021ab5cbc6270c9c0ab3d0b1db948d94bd07b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_swartz, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Jan 23 04:54:26 np0005593295 systemd[1]: libpod-conmon-c60165b7bc290c13207dc0c2554021ab5cbc6270c9c0ab3d0b1db948d94bd07b.scope: Deactivated successfully.
Jan 23 04:54:26 np0005593295 podman[87744]: 2026-01-23 09:54:26.45896831 +0000 UTC m=+0.043420501 container create 373756fe3f1f248643a306321abb7932a9e79ed1266276e0cb584ecb8157f981 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_kirch, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:54:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095426 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 04:54:26 np0005593295 systemd[1]: Started libpod-conmon-373756fe3f1f248643a306321abb7932a9e79ed1266276e0cb584ecb8157f981.scope.
Jan 23 04:54:26 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:54:26 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e39c4c5866c61c4e61303a94c9e6fdf8d183c645523a3b89450f5a9d8d95c649/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:54:26 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e39c4c5866c61c4e61303a94c9e6fdf8d183c645523a3b89450f5a9d8d95c649/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:54:26 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e39c4c5866c61c4e61303a94c9e6fdf8d183c645523a3b89450f5a9d8d95c649/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:54:26 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e39c4c5866c61c4e61303a94c9e6fdf8d183c645523a3b89450f5a9d8d95c649/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:54:26 np0005593295 podman[87744]: 2026-01-23 09:54:26.439288281 +0000 UTC m=+0.023740482 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:54:26 np0005593295 podman[87744]: 2026-01-23 09:54:26.540220282 +0000 UTC m=+0.124672493 container init 373756fe3f1f248643a306321abb7932a9e79ed1266276e0cb584ecb8157f981 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_kirch, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Jan 23 04:54:26 np0005593295 podman[87744]: 2026-01-23 09:54:26.601060829 +0000 UTC m=+0.185513010 container start 373756fe3f1f248643a306321abb7932a9e79ed1266276e0cb584ecb8157f981 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_kirch, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Jan 23 04:54:26 np0005593295 podman[87744]: 2026-01-23 09:54:26.604898447 +0000 UTC m=+0.189350658 container attach 373756fe3f1f248643a306321abb7932a9e79ed1266276e0cb584ecb8157f981 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_kirch, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:54:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:26 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:26 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:26 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e121 e121: 3 total, 3 up, 3 in
Jan 23 04:54:27 np0005593295 python3.9[87892]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:54:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 04:54:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:27.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 04:54:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 04:54:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:27.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 04:54:27 np0005593295 funny_kirch[87792]: [
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:    {
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:        "available": false,
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:        "being_replaced": false,
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:        "ceph_device_lvm": false,
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:        "lsm_data": {},
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:        "lvs": [],
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:        "path": "/dev/sr0",
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:        "rejected_reasons": [
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "Insufficient space (<5GB)",
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "Has a FileSystem"
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:        ],
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:        "sys_api": {
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "actuators": null,
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "device_nodes": [
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:                "sr0"
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            ],
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "devname": "sr0",
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "human_readable_size": "482.00 KB",
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "id_bus": "ata",
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "model": "QEMU DVD-ROM",
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "nr_requests": "2",
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "parent": "/dev/sr0",
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "partitions": {},
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "path": "/dev/sr0",
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "removable": "1",
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "rev": "2.5+",
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "ro": "0",
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "rotational": "1",
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "sas_address": "",
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "sas_device_handle": "",
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "scheduler_mode": "mq-deadline",
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "sectors": 0,
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "sectorsize": "2048",
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "size": 493568.0,
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "support_discard": "2048",
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "type": "disk",
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:            "vendor": "QEMU"
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:        }
Jan 23 04:54:27 np0005593295 funny_kirch[87792]:    }
Jan 23 04:54:27 np0005593295 funny_kirch[87792]: ]
Jan 23 04:54:27 np0005593295 systemd[1]: libpod-373756fe3f1f248643a306321abb7932a9e79ed1266276e0cb584ecb8157f981.scope: Deactivated successfully.
Jan 23 04:54:27 np0005593295 podman[87744]: 2026-01-23 09:54:27.561931015 +0000 UTC m=+1.146383216 container died 373756fe3f1f248643a306321abb7932a9e79ed1266276e0cb584ecb8157f981 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_kirch, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Jan 23 04:54:27 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Jan 23 04:54:27 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:27 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:27 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 04:54:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:27 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:27 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:27 np0005593295 systemd[1]: var-lib-containers-storage-overlay-e39c4c5866c61c4e61303a94c9e6fdf8d183c645523a3b89450f5a9d8d95c649-merged.mount: Deactivated successfully.
Jan 23 04:54:27 np0005593295 podman[87744]: 2026-01-23 09:54:27.67579049 +0000 UTC m=+1.260242671 container remove 373756fe3f1f248643a306321abb7932a9e79ed1266276e0cb584ecb8157f981 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_kirch, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Jan 23 04:54:27 np0005593295 systemd[1]: libpod-conmon-373756fe3f1f248643a306321abb7932a9e79ed1266276e0cb584ecb8157f981.scope: Deactivated successfully.
Jan 23 04:54:28 np0005593295 python3.9[89237]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:54:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:28 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:28 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:28 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:28 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:28 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 04:54:28 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:54:28 np0005593295 ceph-mon[75771]: Updating compute-0:/etc/ceph/ceph.conf
Jan 23 04:54:28 np0005593295 ceph-mon[75771]: Updating compute-1:/etc/ceph/ceph.conf
Jan 23 04:54:28 np0005593295 ceph-mon[75771]: Updating compute-2:/etc/ceph/ceph.conf
Jan 23 04:54:28 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Jan 23 04:54:28 np0005593295 ceph-mon[75771]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 04:54:28 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e122 e122: 3 total, 3 up, 3 in
Jan 23 04:54:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:29.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 04:54:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:29.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:29 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:29 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000005:nfs.cephfs.1: -2
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 04:54:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 04:54:29 np0005593295 ceph-mon[75771]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 04:54:29 np0005593295 ceph-mon[75771]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 04:54:29 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Jan 23 04:54:29 np0005593295 ceph-mon[75771]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Jan 23 04:54:29 np0005593295 ceph-mon[75771]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 23 04:54:29 np0005593295 python3.9[89619]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:54:29 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e123 e123: 3 total, 3 up, 3 in
Jan 23 04:54:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:30 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:30 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:30 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:30 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e124 e124: 3 total, 3 up, 3 in
Jan 23 04:54:30 np0005593295 ceph-mon[75771]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 04:54:30 np0005593295 ceph-mon[75771]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 04:54:30 np0005593295 ceph-mon[75771]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 23 04:54:30 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Jan 23 04:54:30 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:30 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:30 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:30 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:30 np0005593295 ceph-mon[75771]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 04:54:30 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:30 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:30 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Jan 23 04:54:30 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:30 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:30 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:54:30 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e125 e125: 3 total, 3 up, 3 in
Jan 23 04:54:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:31 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0240016e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:31.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:31.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:31 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:31 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:31 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:31 np0005593295 ceph-mon[75771]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Jan 23 04:54:31 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Jan 23 04:54:31 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Jan 23 04:54:31 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:31 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:54:31 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e126 e126: 3 total, 3 up, 3 in
Jan 23 04:54:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:32 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:32 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:32 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:32 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Jan 23 04:54:33 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e127 e127: 3 total, 3 up, 3 in
Jan 23 04:54:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095433 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 04:54:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:33 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:33.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:33.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:33 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:33 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:33 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:33 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:54:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:33 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:54:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:33 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:54:34 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Jan 23 04:54:34 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e128 e128: 3 total, 3 up, 3 in
Jan 23 04:54:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:34 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:34 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:34 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:35 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018001b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:35 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Jan 23 04:54:35 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:35 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e129 e129: 3 total, 3 up, 3 in
Jan 23 04:54:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 04:54:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:35.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 04:54:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 04:54:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:35.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 04:54:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:35 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010000d00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:35 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:35 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:35 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:36 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e130 e130: 3 total, 3 up, 3 in
Jan 23 04:54:36 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Jan 23 04:54:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:36 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:36 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:36 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:37 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:37 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 04:54:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.002000044s ======
Jan 23 04:54:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:37.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000044s
Jan 23 04:54:37 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:37 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:37 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:37 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Jan 23 04:54:37 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 04:54:37 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e131 e131: 3 total, 3 up, 3 in
Jan 23 04:54:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:37.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:37 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018001b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:37 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:37 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:38 np0005593295 ceph-mon[75771]: Reconfiguring mon.compute-0 (monmap changed)...
Jan 23 04:54:38 np0005593295 ceph-mon[75771]: Reconfiguring daemon mon.compute-0 on compute-0
Jan 23 04:54:38 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Jan 23 04:54:38 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:38 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:38 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.nbdygh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 23 04:54:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:38 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:38 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:38 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010001820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:39 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c008dc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:39.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:39 np0005593295 ceph-mon[75771]: Reconfiguring mgr.compute-0.nbdygh (monmap changed)...
Jan 23 04:54:39 np0005593295 ceph-mon[75771]: Reconfiguring daemon mgr.compute-0.nbdygh on compute-0
Jan 23 04:54:39 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:39 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:39 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 23 04:54:39 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Jan 23 04:54:39 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:39.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:39 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c008dc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:39 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e132 e132: 3 total, 3 up, 3 in
Jan 23 04:54:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:39 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:39 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:39 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 132 pg[10.1e( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=80/80 les/c/f=81/81/0 sis=132) [2] r=0 lpr=132 pi=[80,132)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:54:40 np0005593295 ceph-mon[75771]: Reconfiguring crash.compute-0 (monmap changed)...
Jan 23 04:54:40 np0005593295 ceph-mon[75771]: Reconfiguring daemon crash.compute-0 on compute-0
Jan 23 04:54:40 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:40 np0005593295 ceph-mon[75771]: Reconfiguring osd.1 (monmap changed)...
Jan 23 04:54:40 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Jan 23 04:54:40 np0005593295 ceph-mon[75771]: Reconfiguring daemon osd.1 on compute-0
Jan 23 04:54:40 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Jan 23 04:54:40 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:40 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e133 e133: 3 total, 3 up, 3 in
Jan 23 04:54:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 133 pg[10.1e( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=80/80 les/c/f=81/81/0 sis=133) [2]/[0] r=-1 lpr=133 pi=[80,133)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:54:40 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 133 pg[10.1e( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=80/80 les/c/f=81/81/0 sis=133) [2]/[0] r=-1 lpr=133 pi=[80,133)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:54:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:40 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:40 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:40 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:40 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018001b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:41 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010001820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 04:54:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:41.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 04:54:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 04:54:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:41.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 04:54:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:41 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:41 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:41 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:41 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:41 np0005593295 ceph-mon[75771]: Reconfiguring alertmanager.compute-0 (dependencies changed)...
Jan 23 04:54:41 np0005593295 ceph-mon[75771]: Reconfiguring daemon alertmanager.compute-0 on compute-0
Jan 23 04:54:41 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:54:41 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e134 e134: 3 total, 3 up, 3 in
Jan 23 04:54:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 134 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=103/104 n=5 ec=59/46 lis/c=103/103 les/c/f=104/104/0 sis=134 pruub=8.646332741s) [0] r=-1 lpr=134 pi=[103,134)/1 crt=62'761 mlcod 0'0 active pruub 168.579345703s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:54:41 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 134 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=103/104 n=5 ec=59/46 lis/c=103/103 les/c/f=104/104/0 sis=134 pruub=8.646267891s) [0] r=-1 lpr=134 pi=[103,134)/1 crt=62'761 mlcod 0'0 unknown NOTIFY pruub 168.579345703s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:54:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:42 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:42 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:42 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:43 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018002f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:43 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e135 e135: 3 total, 3 up, 3 in
Jan 23 04:54:43 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 135 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=103/104 n=5 ec=59/46 lis/c=103/103 les/c/f=104/104/0 sis=135) [0]/[2] r=0 lpr=135 pi=[103,135)/1 crt=62'761 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:54:43 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 135 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=103/104 n=5 ec=59/46 lis/c=103/103 les/c/f=104/104/0 sis=135) [0]/[2] r=0 lpr=135 pi=[103,135)/1 crt=62'761 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:54:43 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 135 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=0/0 n=5 ec=59/46 lis/c=133/80 les/c/f=134/81/0 sis=135) [2] r=0 lpr=135 pi=[80,135)/1 luod=0'0 crt=62'763 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:54:43 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 135 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=0/0 n=5 ec=59/46 lis/c=133/80 les/c/f=134/81/0 sis=135) [2] r=0 lpr=135 pi=[80,135)/1 crt=62'763 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:54:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 04:54:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:43.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 04:54:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 04:54:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:43.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 04:54:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:43 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010001820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:43 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:43 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095443 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 04:54:43 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:54:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:44 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:44 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:44 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:45 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e136 e136: 3 total, 3 up, 3 in
Jan 23 04:54:45 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 136 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=135/136 n=5 ec=59/46 lis/c=133/80 les/c/f=134/81/0 sis=135) [2] r=0 lpr=135 pi=[80,135)/1 crt=62'763 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:54:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:45 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:45.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 04:54:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:45.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 04:54:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:45 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018002f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:45 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 136 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=135/136 n=5 ec=59/46 lis/c=103/103 les/c/f=104/104/0 sis=135) [0]/[2] async=[0] r=0 lpr=135 pi=[103,135)/1 crt=62'761 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:54:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:45 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:45 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:45 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:46 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e137 e137: 3 total, 3 up, 3 in
Jan 23 04:54:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 137 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=135/136 n=5 ec=59/46 lis/c=135/103 les/c/f=136/104/0 sis=137 pruub=15.589160919s) [0] async=[0] r=-1 lpr=137 pi=[103,137)/1 crt=62'761 mlcod 62'761 active pruub 179.562911987s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 04:54:46 np0005593295 ceph-osd[81231]: osd.2 pg_epoch: 137 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=135/136 n=5 ec=59/46 lis/c=135/103 les/c/f=136/104/0 sis=137 pruub=15.589061737s) [0] r=-1 lpr=137 pi=[103,137)/1 crt=62'761 mlcod 0'0 unknown NOTIFY pruub 179.562911987s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:54:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:46 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:46 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:46 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:46 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:46 np0005593295 ceph-mon[75771]: Reconfiguring grafana.compute-0 (dependencies changed)...
Jan 23 04:54:46 np0005593295 ceph-mon[75771]: Reconfiguring daemon grafana.compute-0 on compute-0
Jan 23 04:54:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:46 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010002cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:47 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 e138: 3 total, 3 up, 3 in
Jan 23 04:54:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:47 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 04:54:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:47.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 04:54:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:47.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:47 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c009ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:47 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:47 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:48 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:48 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:48 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:49 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010002cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 04:54:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:49.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 04:54:49 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:49 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:49 np0005593295 ceph-mon[75771]: Reconfiguring crash.compute-1 (monmap changed)...
Jan 23 04:54:49 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 23 04:54:49 np0005593295 ceph-mon[75771]: Reconfiguring daemon crash.compute-1 on compute-1
Jan 23 04:54:49 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:49 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:49 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Jan 23 04:54:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 04:54:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:49.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 04:54:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:49 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:49 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:49 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:50 np0005593295 ceph-mon[75771]: Reconfiguring osd.0 (monmap changed)...
Jan 23 04:54:50 np0005593295 ceph-mon[75771]: Reconfiguring daemon osd.0 on compute-1
Jan 23 04:54:50 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:50 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:50 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 04:54:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:50 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:50 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:50 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:50 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c009ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:51 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:51.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:51 np0005593295 ceph-mon[75771]: Reconfiguring mon.compute-1 (monmap changed)...
Jan 23 04:54:51 np0005593295 ceph-mon[75771]: Reconfiguring daemon mon.compute-1 on compute-1
Jan 23 04:54:51 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:51 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:51.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:51 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:51 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:51 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:52 np0005593295 podman[90363]: 2026-01-23 09:54:52.270893417 +0000 UTC m=+0.047709238 container create 1d129b112d968c9c6cc7557faa9abab01f5d2ae594e6911974bb38a68e2e975b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_keller, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:54:52 np0005593295 systemd[1]: Started libpod-conmon-1d129b112d968c9c6cc7557faa9abab01f5d2ae594e6911974bb38a68e2e975b.scope.
Jan 23 04:54:52 np0005593295 podman[90363]: 2026-01-23 09:54:52.250179565 +0000 UTC m=+0.026995406 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:54:52 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:54:52 np0005593295 podman[90363]: 2026-01-23 09:54:52.37540016 +0000 UTC m=+0.152216001 container init 1d129b112d968c9c6cc7557faa9abab01f5d2ae594e6911974bb38a68e2e975b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_keller, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Jan 23 04:54:52 np0005593295 podman[90363]: 2026-01-23 09:54:52.38506498 +0000 UTC m=+0.161880801 container start 1d129b112d968c9c6cc7557faa9abab01f5d2ae594e6911974bb38a68e2e975b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_keller, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:54:52 np0005593295 fervent_keller[90380]: 167 167
Jan 23 04:54:52 np0005593295 ceph-mon[75771]: Reconfiguring node-exporter.compute-1 (unknown last config time)...
Jan 23 04:54:52 np0005593295 ceph-mon[75771]: Reconfiguring daemon node-exporter.compute-1 on compute-1
Jan 23 04:54:52 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:52 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:52 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 04:54:52 np0005593295 podman[90363]: 2026-01-23 09:54:52.393435191 +0000 UTC m=+0.170251012 container attach 1d129b112d968c9c6cc7557faa9abab01f5d2ae594e6911974bb38a68e2e975b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_keller, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Jan 23 04:54:52 np0005593295 systemd[1]: libpod-1d129b112d968c9c6cc7557faa9abab01f5d2ae594e6911974bb38a68e2e975b.scope: Deactivated successfully.
Jan 23 04:54:52 np0005593295 podman[90363]: 2026-01-23 09:54:52.39736649 +0000 UTC m=+0.174182311 container died 1d129b112d968c9c6cc7557faa9abab01f5d2ae594e6911974bb38a68e2e975b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_keller, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Jan 23 04:54:52 np0005593295 systemd[1]: var-lib-containers-storage-overlay-0e5cc20f44840a878b108cbc1f67133a8c09989ae827e057a2ebfa0b36f14ff4-merged.mount: Deactivated successfully.
Jan 23 04:54:52 np0005593295 podman[90363]: 2026-01-23 09:54:52.449767915 +0000 UTC m=+0.226583736 container remove 1d129b112d968c9c6cc7557faa9abab01f5d2ae594e6911974bb38a68e2e975b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_keller, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Jan 23 04:54:52 np0005593295 systemd[1]: libpod-conmon-1d129b112d968c9c6cc7557faa9abab01f5d2ae594e6911974bb38a68e2e975b.scope: Deactivated successfully.
Jan 23 04:54:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:52 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:52 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:52 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010002cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:53 np0005593295 podman[90461]: 2026-01-23 09:54:53.033692327 +0000 UTC m=+0.047536625 container create dfdb72613555460c5a24bba1a4e33d2efce25b8309a69866f71d4d886df29eec (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_bohr, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:54:53 np0005593295 systemd[1]: Started libpod-conmon-dfdb72613555460c5a24bba1a4e33d2efce25b8309a69866f71d4d886df29eec.scope.
Jan 23 04:54:53 np0005593295 systemd[1]: Started libcrun container.
Jan 23 04:54:53 np0005593295 podman[90461]: 2026-01-23 09:54:53.012357951 +0000 UTC m=+0.026202279 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:54:53 np0005593295 podman[90461]: 2026-01-23 09:54:53.118322476 +0000 UTC m=+0.132166794 container init dfdb72613555460c5a24bba1a4e33d2efce25b8309a69866f71d4d886df29eec (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_bohr, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS)
Jan 23 04:54:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:53 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c00a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:53 np0005593295 podman[90461]: 2026-01-23 09:54:53.143342427 +0000 UTC m=+0.157186725 container start dfdb72613555460c5a24bba1a4e33d2efce25b8309a69866f71d4d886df29eec (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_bohr, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Jan 23 04:54:53 np0005593295 focused_bohr[90477]: 167 167
Jan 23 04:54:53 np0005593295 systemd[1]: libpod-dfdb72613555460c5a24bba1a4e33d2efce25b8309a69866f71d4d886df29eec.scope: Deactivated successfully.
Jan 23 04:54:53 np0005593295 podman[90461]: 2026-01-23 09:54:53.151725268 +0000 UTC m=+0.165569576 container attach dfdb72613555460c5a24bba1a4e33d2efce25b8309a69866f71d4d886df29eec (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_bohr, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid)
Jan 23 04:54:53 np0005593295 podman[90461]: 2026-01-23 09:54:53.152579608 +0000 UTC m=+0.166423906 container died dfdb72613555460c5a24bba1a4e33d2efce25b8309a69866f71d4d886df29eec (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_bohr, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:54:53 np0005593295 systemd[1]: var-lib-containers-storage-overlay-99de69510b031bdf857af35900514889eda3e4df043bde2fb09067e31662f7d8-merged.mount: Deactivated successfully.
Jan 23 04:54:53 np0005593295 podman[90461]: 2026-01-23 09:54:53.200153012 +0000 UTC m=+0.213997310 container remove dfdb72613555460c5a24bba1a4e33d2efce25b8309a69866f71d4d886df29eec (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_bohr, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Jan 23 04:54:53 np0005593295 systemd[1]: libpod-conmon-dfdb72613555460c5a24bba1a4e33d2efce25b8309a69866f71d4d886df29eec.scope: Deactivated successfully.
Jan 23 04:54:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 04:54:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:53.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 04:54:53 np0005593295 ceph-mon[75771]: Reconfiguring mon.compute-2 (monmap changed)...
Jan 23 04:54:53 np0005593295 ceph-mon[75771]: Reconfiguring daemon mon.compute-2 on compute-2
Jan 23 04:54:53 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:53 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:53 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 23 04:54:53 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:53 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:53 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Jan 23 04:54:53 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:53.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:53 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:53 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:53 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:54 np0005593295 ceph-mon[75771]: Reconfiguring crash.compute-2 (unknown last config time)...
Jan 23 04:54:54 np0005593295 ceph-mon[75771]: Reconfiguring daemon crash.compute-2 on compute-2
Jan 23 04:54:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:54 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:54 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:54 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:55 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010002cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 04:54:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:55.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 04:54:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:55.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:55 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c00a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:55 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:55 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:55 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:56 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:56 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:56 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:56 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:56 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:57 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 04:54:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:57.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 04:54:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 04:54:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:57.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 04:54:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:57 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:57 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:57 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:57 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:57 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:57 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:54:57 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:57 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:54:57 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:54:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:58 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:58 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:58 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c00a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:58 np0005593295 ceph-mon[75771]: Health check cleared: CEPHADM_FAILED_DAEMON (was: 1 failed cephadm daemon(s))
Jan 23 04:54:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:59 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:59.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:54:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 04:54:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:59.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 04:54:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:54:59 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:54:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:54:59 2026: (VI_0) received an invalid passwd!
Jan 23 04:54:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:54:59 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:00 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:00 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:00 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:00 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:01 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c00a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:01.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:01.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:01 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:01 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:01 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:02 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:02 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:02 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:03 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:55:03 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:55:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:03 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 04:55:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:03.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 04:55:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:03.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:03 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:03 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:03 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:04 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:04 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:04 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:05 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:55:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:05 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:05.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000022s ======
Jan 23 04:55:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:05.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Jan 23 04:55:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:05 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:05 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:05 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:05 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:06 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:06 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:06 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034001d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:07 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:07.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:07.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:07 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:07 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:07 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:08 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:08 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:08 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:09 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034001d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:09.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:09.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:09 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:09 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:09 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:10 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:10 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:10 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:10 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:11 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:11.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:11.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:11 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034002a40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:11 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:11 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:12 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:12 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:12 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:13 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:13.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:13.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:13 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:13 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:13 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:14 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:14 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:14 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034002bc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:15 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:15.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:15.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:15 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:15 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:15 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:15 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:16 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:16 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:16 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:17 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0340034e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:17.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:17.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:17 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:17 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:17 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:18 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:18 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:18 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:19 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:19.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:19.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:19 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0340034e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:19 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:19 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:20 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:20 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:20 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:20 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:21 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:21.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:21.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:21 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:21 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:21 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:22 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:22 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:22 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0340034e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:23 np0005593295 python3.9[90778]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:55:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:23.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:23.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:23 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:23 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:24 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:24 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:24 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:25 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:25.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:25 np0005593295 python3.9[91067]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 23 04:55:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:25.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:25 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:25 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:25 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:25 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:26 np0005593295 python3.9[91221]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 23 04:55:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:26 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:26 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:26 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:27 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:27 np0005593295 python3.9[91373]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:55:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:27.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:27.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:27 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:27 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:27 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:28 np0005593295 python3.9[91526]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 23 04:55:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:28 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:28 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:28 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:29.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:29.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:29 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:29 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:30 np0005593295 python3.9[91680]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:55:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:30 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:30 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:30 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:30 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:31 np0005593295 python3.9[91833]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:55:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:31 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff010003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:31.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:31 np0005593295 python3.9[91911]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:55:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:55:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:31.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:55:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:31 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:31 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:31 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:32 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:32 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:32 np0005593295 python3.9[92065]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:55:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:32 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:33 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:33.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:33.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:33 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff03c0089d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:33 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:33 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095533 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 04:55:34 np0005593295 python3.9[92221]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 23 04:55:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:34 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:34 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:34 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:34 np0005593295 python3.9[92375]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 23 04:55:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:35 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:35.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:35.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:35 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:35 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:35 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:35 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:35 np0005593295 python3.9[92529]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 04:55:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:36 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:36 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:36 np0005593295 python3.9[92683]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 23 04:55:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:36 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:37 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:37.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:37.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:37 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:37 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:37 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:37 np0005593295 python3.9[92860]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:55:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:38 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:38 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:38 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0140016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:39 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:39.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:39.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:39 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:39 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:39 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:40 np0005593295 python3.9[93015]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:55:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:40 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:40 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:40 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:40 np0005593295 python3.9[93169]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:55:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:40 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:41 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0140016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:41.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:41 np0005593295 python3.9[93247]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:55:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:41.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:41 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:41 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:41 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:42 np0005593295 python3.9[93399]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:55:42 np0005593295 python3.9[93479]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:55:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:42 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:42 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:42 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003d50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:43 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003d50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:43.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:43.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:43 np0005593295 python3.9[93632]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:55:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:43 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003d50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:43 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:43 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:44 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:55:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:44 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:44 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:44 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0140016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:44 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Jan 23 04:55:44 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:44.942544) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:55:44 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Jan 23 04:55:44 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162144942800, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2005, "num_deletes": 251, "total_data_size": 8191375, "memory_usage": 8441032, "flush_reason": "Manual Compaction"}
Jan 23 04:55:44 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Jan 23 04:55:44 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162144984112, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 5055840, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8516, "largest_seqno": 10515, "table_properties": {"data_size": 5047090, "index_size": 5308, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 19703, "raw_average_key_size": 20, "raw_value_size": 5028702, "raw_average_value_size": 5332, "num_data_blocks": 236, "num_entries": 943, "num_filter_entries": 943, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162058, "oldest_key_time": 1769162058, "file_creation_time": 1769162144, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:55:44 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 41566 microseconds, and 19614 cpu microseconds.
Jan 23 04:55:44 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:55:44 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:44.984189) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 5055840 bytes OK
Jan 23 04:55:44 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:44.984219) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Jan 23 04:55:44 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:44.986184) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Jan 23 04:55:44 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:44.986211) EVENT_LOG_v1 {"time_micros": 1769162144986207, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:55:44 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:44.986233) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:55:44 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 8181745, prev total WAL file size 8181745, number of live WAL files 2.
Jan 23 04:55:44 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:55:44 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:44.987840) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Jan 23 04:55:44 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:55:44 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(4937KB)], [18(11MB)]
Jan 23 04:55:44 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162144988123, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 17496594, "oldest_snapshot_seqno": -1}
Jan 23 04:55:45 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 4120 keys, 13781623 bytes, temperature: kUnknown
Jan 23 04:55:45 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162145106661, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 13781623, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13748045, "index_size": 22204, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10309, "raw_key_size": 104991, "raw_average_key_size": 25, "raw_value_size": 13666591, "raw_average_value_size": 3317, "num_data_blocks": 954, "num_entries": 4120, "num_filter_entries": 4120, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769162144, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:55:45 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:55:45 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:45.107304) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 13781623 bytes
Jan 23 04:55:45 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:45.108501) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 147.1 rd, 115.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.8, 11.9 +0.0 blob) out(13.1 +0.0 blob), read-write-amplify(6.2) write-amplify(2.7) OK, records in: 4654, records dropped: 534 output_compression: NoCompression
Jan 23 04:55:45 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:45.108532) EVENT_LOG_v1 {"time_micros": 1769162145108518, "job": 8, "event": "compaction_finished", "compaction_time_micros": 118949, "compaction_time_cpu_micros": 44602, "output_level": 6, "num_output_files": 1, "total_output_size": 13781623, "num_input_records": 4654, "num_output_records": 4120, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:55:45 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:55:45 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162145109531, "job": 8, "event": "table_file_deletion", "file_number": 20}
Jan 23 04:55:45 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:55:45 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162145111323, "job": 8, "event": "table_file_deletion", "file_number": 18}
Jan 23 04:55:45 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:44.987659) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:55:45 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:45.111485) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:55:45 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:45.111490) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:55:45 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:45.111492) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:55:45 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:45.111493) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:55:45 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:55:45.111496) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:55:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:45 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:45.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:45.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:45 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:45 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:45 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:45 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:46 np0005593295 python3.9[93786]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:55:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:46 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:46 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:46 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003d70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:47 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:47 np0005593295 python3.9[93939]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 23 04:55:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:47.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:47.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:47 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0080016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:47 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:47 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:47 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:55:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:47 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:55:48 np0005593295 python3.9[94090]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:55:48 np0005593295 systemd[1]: session-19.scope: Deactivated successfully.
Jan 23 04:55:48 np0005593295 systemd[1]: session-19.scope: Consumed 9.049s CPU time.
Jan 23 04:55:48 np0005593295 systemd-logind[786]: Session 19 logged out. Waiting for processes to exit.
Jan 23 04:55:48 np0005593295 systemd-logind[786]: Removed session 19.
Jan 23 04:55:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:48 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:48 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:48 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:49 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003d90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:49.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:49.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:49 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:49 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:49 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:49 np0005593295 python3.9[94243]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:55:49 np0005593295 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 23 04:55:49 np0005593295 systemd[1]: tuned.service: Deactivated successfully.
Jan 23 04:55:49 np0005593295 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 23 04:55:49 np0005593295 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 23 04:55:50 np0005593295 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 23 04:55:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:50 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:50 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:50 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:50 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0080016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:50 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 04:55:51 np0005593295 python3.9[94408]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 23 04:55:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:51 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:55:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:51.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:55:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:55:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:51.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:55:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:51 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:51 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:51 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:52 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:52 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:52 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:53 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:55:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:53.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:55:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:53.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:53 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:53 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:53 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:54 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:54 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:54 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:55 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008002720 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:55.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:55.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:55 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:55 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:55 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:55 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:55 np0005593295 python3.9[94564]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:55:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095555 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 04:55:56 np0005593295 python3.9[94720]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:55:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:56 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:56 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:56 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:57 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:55:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:57.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:55:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:55:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:57.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:55:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:57 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008002720 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:57 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:57 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:58 np0005593295 systemd[1]: session-37.scope: Deactivated successfully.
Jan 23 04:55:58 np0005593295 systemd[1]: session-37.scope: Consumed 1min 15.079s CPU time.
Jan 23 04:55:58 np0005593295 systemd-logind[786]: Session 37 logged out. Waiting for processes to exit.
Jan 23 04:55:58 np0005593295 systemd-logind[786]: Removed session 37.
Jan 23 04:55:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:58 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:58 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:58 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:59 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:59.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:55:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:59.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:55:59 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008002720 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:55:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:55:59 2026: (VI_0) received an invalid passwd!
Jan 23 04:55:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:55:59 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:00 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:00 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:00 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:00 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003e30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:01 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003e30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:56:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:01.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:56:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:56:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:01.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:56:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:01 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:01 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:01 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:02 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:02 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:02 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:03 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:03.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:03.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:03 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:03 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:03 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:04 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:04 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:04 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:05 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:56:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:05.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:56:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:05.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:05 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:05 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:05 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:05 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:06 np0005593295 systemd-logind[786]: New session 39 of user zuul.
Jan 23 04:56:06 np0005593295 systemd[1]: Started Session 39 of User zuul.
Jan 23 04:56:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:06 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:06 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:06 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:07 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:07 np0005593295 python3.9[95015]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:56:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:07.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:07.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:07 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:07 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:07 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:08 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:08 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:08 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:09 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003e90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:09.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:56:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:09.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:56:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:09 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:09 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:09 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:09 np0005593295 python3.9[95173]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 23 04:56:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:10 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:10 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:10 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:10 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:11 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:11.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:11 np0005593295 python3.9[95328]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:56:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:11.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:11 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff018003eb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:11 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:11 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:11 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:56:11 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:56:12 np0005593295 python3.9[95413]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 04:56:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:12 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:12 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:12 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:56:12 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:56:12 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:56:12 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:56:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:12 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:13 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:56:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:13.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:56:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:56:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:13.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:56:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:13 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:13 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:13 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:14 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:14 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:14 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:14 np0005593295 python3.9[95569]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:56:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:15 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:56:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:15.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:56:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:15.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:15 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:15 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:15 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:15 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:16 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:16 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:16 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:17 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:56:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:17.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:56:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:17.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:17 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0240008d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:17 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:17 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:17 np0005593295 python3.9[95751]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 04:56:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:18 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:18 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:18 np0005593295 python3.9[95906]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:56:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:18 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:19 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:19.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:56:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:19.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:56:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:19 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:19 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:19 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:19 np0005593295 python3.9[96058]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 23 04:56:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:20 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:20 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:20 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:20 np0005593295 python3.9[96210]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:56:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:20 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0240008d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:21 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:21.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:21.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:21 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:21 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:21 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095621 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 04:56:21 np0005593295 python3.9[96368]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:56:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:22 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:22 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:22 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff0240008d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:23 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:56:23 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:56:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:56:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:23.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:56:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:23.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:23 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:23 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:23 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:24 np0005593295 python3.9[96550]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:56:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:24 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:24 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:24 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:25 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:25.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:56:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:25.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:56:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:25 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:25 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002a50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:25 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:25 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:26 np0005593295 python3.9[96838]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 23 04:56:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:26 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:26 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:26 np0005593295 python3.9[96989]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:56:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:26 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:27 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:56:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:27.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:56:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:27 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:56:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:27.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:56:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:27 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:27 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:27 np0005593295 python3.9[97143]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:56:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:28 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:28 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:28 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024002a50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:56:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:29.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:56:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:29 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:29 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:56:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:29.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:56:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:29 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:30 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:30 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:30 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:30 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:31 np0005593295 python3.9[97300]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:56:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:31 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003760 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:56:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:31.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:56:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:31 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:31 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:31.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:31 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:32 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:32 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:32 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:33 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:56:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:33.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:56:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:33 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:56:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:33 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:33 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:33 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003760 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.002000051s ======
Jan 23 04:56:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:33.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000051s
Jan 23 04:56:33 np0005593295 python3.9[97455]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:56:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:34 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:34 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:34 np0005593295 python3.9[97611]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Jan 23 04:56:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:34 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff034004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:35 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff008003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:35.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:35 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:35 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:35 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:35.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:35 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:35 np0005593295 systemd[1]: session-39.scope: Deactivated successfully.
Jan 23 04:56:35 np0005593295 systemd[1]: session-39.scope: Consumed 19.026s CPU time.
Jan 23 04:56:35 np0005593295 systemd-logind[786]: Session 39 logged out. Waiting for processes to exit.
Jan 23 04:56:35 np0005593295 systemd-logind[786]: Removed session 39.
Jan 23 04:56:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:36 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:56:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:36 : epoch 69734548 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:56:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:36 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:36 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:36 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003760 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:56:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[85730]: 23/01/2026 09:56:37 : epoch 69734548 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff024003760 fd 48 proxy ignored for local
Jan 23 04:56:37 np0005593295 kernel: ganesha.nfsd[95571]: segfault at 50 ip 00007ff0bf73232e sp 00007ff052ffc210 error 4 in libntirpc.so.5.8[7ff0bf717000+2c000] likely on CPU 5 (core 0, socket 5)
Jan 23 04:56:37 np0005593295 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 04:56:37 np0005593295 systemd[1]: Started Process Core Dump (PID 97645/UID 0).
Jan 23 04:56:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:56:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:37.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:56:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:37 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:37 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:37.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:38 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:38 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:38 np0005593295 systemd-coredump[97661]: Process 85734 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 62:#012#0  0x00007ff0bf73232e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 23 04:56:38 np0005593295 systemd[1]: systemd-coredump@1-97645-0.service: Deactivated successfully.
Jan 23 04:56:38 np0005593295 systemd[1]: systemd-coredump@1-97645-0.service: Consumed 1.656s CPU time.
Jan 23 04:56:39 np0005593295 podman[97671]: 2026-01-23 09:56:39.027166118 +0000 UTC m=+0.031797858 container died 2988332eba883a20d1059e3b3728d769fb16998f721e57e41af0869ab3b7663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid)
Jan 23 04:56:39 np0005593295 systemd[1]: var-lib-containers-storage-overlay-a90f8a234e18b05e243a2b45741ab580a2f24f36b6337b5ad8040626fe6cbe4d-merged.mount: Deactivated successfully.
Jan 23 04:56:39 np0005593295 systemd[77796]: Created slice User Background Tasks Slice.
Jan 23 04:56:39 np0005593295 systemd[77796]: Starting Cleanup of User's Temporary Files and Directories...
Jan 23 04:56:39 np0005593295 podman[97671]: 2026-01-23 09:56:39.069839081 +0000 UTC m=+0.074470801 container remove 2988332eba883a20d1059e3b3728d769fb16998f721e57e41af0869ab3b7663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:56:39 np0005593295 systemd[77796]: Finished Cleanup of User's Temporary Files and Directories.
Jan 23 04:56:39 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 04:56:39 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 04:56:39 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.209s CPU time.
Jan 23 04:56:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:56:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:39.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:56:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:39 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:39 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:56:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:39.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:56:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:40 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:40 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:40 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:56:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:41.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:56:41 np0005593295 systemd-logind[786]: New session 40 of user zuul.
Jan 23 04:56:41 np0005593295 systemd[1]: Started Session 40 of User zuul.
Jan 23 04:56:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:41 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:41 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:41.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:42 np0005593295 python3.9[97873]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:56:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:42 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:42 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095643 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 04:56:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:43.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:43 np0005593295 python3.9[98028]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:56:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:43 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:43 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:43.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:44 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:44 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:44 np0005593295 python3.9[98223]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:56:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:56:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:45.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:56:45 np0005593295 systemd[1]: session-40.scope: Deactivated successfully.
Jan 23 04:56:45 np0005593295 systemd[1]: session-40.scope: Consumed 2.271s CPU time.
Jan 23 04:56:45 np0005593295 systemd-logind[786]: Session 40 logged out. Waiting for processes to exit.
Jan 23 04:56:45 np0005593295 systemd-logind[786]: Removed session 40.
Jan 23 04:56:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:45 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:45 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:56:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:45.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:56:45 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095645 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 04:56:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:46 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:46 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:56:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:47.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:56:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:47 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:47 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:56:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:47.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:56:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:48 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:48 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:49 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 2.
Jan 23 04:56:49 np0005593295 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:56:49 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.209s CPU time.
Jan 23 04:56:49 np0005593295 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:56:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:49.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:49 np0005593295 podman[98300]: 2026-01-23 09:56:49.482550706 +0000 UTC m=+0.023799297 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:56:49 np0005593295 podman[98300]: 2026-01-23 09:56:49.640893384 +0000 UTC m=+0.182141975 container create d693275e105c73aef6f04d631fc637ff536d8a3b7f8c2d079dfe0bd3e5450fb1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Jan 23 04:56:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:49 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:49 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:49.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:49 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b19ca06aee9ccd666281d5d22cdbe211ab7c13b8da2e9314aa375f7fd13de7bf/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 04:56:49 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b19ca06aee9ccd666281d5d22cdbe211ab7c13b8da2e9314aa375f7fd13de7bf/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:56:49 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b19ca06aee9ccd666281d5d22cdbe211ab7c13b8da2e9314aa375f7fd13de7bf/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:56:49 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b19ca06aee9ccd666281d5d22cdbe211ab7c13b8da2e9314aa375f7fd13de7bf/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:56:50 np0005593295 podman[98300]: 2026-01-23 09:56:50.285540153 +0000 UTC m=+0.826788744 container init d693275e105c73aef6f04d631fc637ff536d8a3b7f8c2d079dfe0bd3e5450fb1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Jan 23 04:56:50 np0005593295 podman[98300]: 2026-01-23 09:56:50.290536854 +0000 UTC m=+0.831785425 container start d693275e105c73aef6f04d631fc637ff536d8a3b7f8c2d079dfe0bd3e5450fb1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid)
Jan 23 04:56:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:50 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 04:56:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:50 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 04:56:50 np0005593295 bash[98300]: d693275e105c73aef6f04d631fc637ff536d8a3b7f8c2d079dfe0bd3e5450fb1
Jan 23 04:56:50 np0005593295 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:56:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:50 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 04:56:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:50 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 04:56:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:50 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 04:56:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:50 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 04:56:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:50 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 04:56:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:50 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:56:50 np0005593295 systemd-logind[786]: New session 41 of user zuul.
Jan 23 04:56:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095650 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 04:56:50 np0005593295 systemd[1]: Started Session 41 of User zuul.
Jan 23 04:56:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:50 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:50 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:50 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:51.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:51 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:51 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:51 np0005593295 python3.9[98512]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:56:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:56:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:51.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:56:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:52 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:52 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:52 np0005593295 python3.9[98668]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:56:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:53.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:53 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:53 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:53.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:53 np0005593295 python3.9[98824]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:56:54 np0005593295 python3.9[98910]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:56:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:54 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:54 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:56:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:55.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:56:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:55 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:55 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:55.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:55 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:56 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:56 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:56 np0005593295 python3.9[99065]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:56:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:56 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:56:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:56 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:56:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:56 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 04:56:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:57 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:56:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:57 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:56:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:56:57 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:56:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:56:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:57.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:56:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:57 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:57 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:56:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:57.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:56:58 np0005593295 python3.9[99287]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:56:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:58 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:58 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:56:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:59.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:56:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:56:59 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:56:59 2026: (VI_0) received an invalid passwd!
Jan 23 04:56:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:56:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:56:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:59.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:56:59 np0005593295 python3.9[99439]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:57:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:00 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:00 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:00 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:00 np0005593295 python3.9[99607]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:57:01 np0005593295 python3.9[99685]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:57:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:01.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:57:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:01 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:01 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:01.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:02 np0005593295 python3.9[99838]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:57:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:02 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:02 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:02 np0005593295 python3.9[99917]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:57:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:03.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:03 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:03 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:03 np0005593295 python3.9[100069]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:57:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:57:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:03.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:57:04 np0005593295 python3.9[100222]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:04 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:04 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 04:57:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:04 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:57:04 np0005593295 python3.9[100375]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:57:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:05 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd49c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:05 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490001c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:05.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:05 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:05 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:05 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:57:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:05.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:57:05 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:06 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:06 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:06 np0005593295 python3.9[100543]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:57:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:07 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095707 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 04:57:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:07 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:07.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:07 np0005593295 python3.9[100695]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:57:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:07 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:07 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:07 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:57:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:07.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:57:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:07 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:57:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:07 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:57:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:08 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:09 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:09 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:09 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:57:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:09.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:57:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:09 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:09 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:09.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:10 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:10 np0005593295 python3.9[100851]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:57:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:10 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:10 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:10 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 04:57:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:11 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478001b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:11 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:11 np0005593295 python3.9[101006]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:57:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:11 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:11.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:11 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:11 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:11.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:11 np0005593295 python3.9[101158]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:57:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:12 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:12 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:12 np0005593295 python3.9[101312]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:57:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:13 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:13 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:13 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478001b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:13.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:13 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:13 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:13.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:13 np0005593295 python3.9[101465]: ansible-service_facts Invoked
Jan 23 04:57:13 np0005593295 network[101482]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:57:13 np0005593295 network[101483]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:57:13 np0005593295 network[101484]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:57:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:14 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095714 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 04:57:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:14 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:15 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:15 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:15 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:57:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:15.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:57:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:15 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:15 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478001b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:15.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:15 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:16 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:16 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:17 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:17 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:17 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:57:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:17.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:57:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:17 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:17 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:57:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:17.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:57:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:18 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:18 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:19 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:19 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:19 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:19.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:19 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:19 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:57:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:19.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:57:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:20 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:20 np0005593295 python3.9[101969]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:57:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:20 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:20 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:21 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:21 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:21 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:57:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:21.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:57:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:21 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:21 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:21.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:22 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:22 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:23 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:23 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:23 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:57:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:23.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:57:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:23 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:23 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:23.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:23 np0005593295 python3.9[102205]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 23 04:57:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:24 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:24 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:57:24 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:57:24 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:57:24 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:57:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:24 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:25 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:25 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:25 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:25 np0005593295 python3.9[102359]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:57:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:25.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:25 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:25 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:57:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:25.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:57:25 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:25 np0005593295 python3.9[102437]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:26 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:26 np0005593295 python3.9[102591]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:57:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:26 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:27 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:27 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:27 np0005593295 python3.9[102669]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:27 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:27.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:27 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:27 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:27.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:28 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:28 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:28 np0005593295 python3.9[102823]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:29 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:29 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:29 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:29.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:29 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:29 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:29.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:30 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:30 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:57:30 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:57:30 np0005593295 python3.9[103003]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:57:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:30 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:30 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:31 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:31 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:31 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:57:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:31.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:57:31 np0005593295 python3.9[103087]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:57:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:31 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:31 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:31.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:32 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:32 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:32 np0005593295 systemd[1]: session-41.scope: Deactivated successfully.
Jan 23 04:57:32 np0005593295 systemd[1]: session-41.scope: Consumed 24.263s CPU time.
Jan 23 04:57:32 np0005593295 systemd-logind[786]: Session 41 logged out. Waiting for processes to exit.
Jan 23 04:57:32 np0005593295 systemd-logind[786]: Removed session 41.
Jan 23 04:57:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:33 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:33 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:33 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:57:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:33.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:57:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:33 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:33 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:33.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:34 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:34 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:35 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:35 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:35 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:57:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:35.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:57:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:35 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:35 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:35.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:35 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:36 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:36 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:37 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:37 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:37 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:57:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:37.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:57:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:37 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:37 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:37.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:38 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:38 np0005593295 systemd-logind[786]: New session 42 of user zuul.
Jan 23 04:57:38 np0005593295 systemd[1]: Started Session 42 of User zuul.
Jan 23 04:57:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:38 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:38 np0005593295 python3.9[103305]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:39 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:39 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:39 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:39.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:39 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:39 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:57:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:39.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:57:39 np0005593295 python3.9[103457]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:57:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:40 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:40 np0005593295 python3.9[103536]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:40 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:40 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:40 np0005593295 systemd[1]: session-42.scope: Deactivated successfully.
Jan 23 04:57:40 np0005593295 systemd[1]: session-42.scope: Consumed 1.519s CPU time.
Jan 23 04:57:40 np0005593295 systemd-logind[786]: Session 42 logged out. Waiting for processes to exit.
Jan 23 04:57:40 np0005593295 systemd-logind[786]: Removed session 42.
Jan 23 04:57:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:41 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488001930 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:41 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:41 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4700016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:57:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:41.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:57:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:41 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:41 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:41.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:42 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:42 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:43 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:43 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:43 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488001930 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:43.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:43 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:43 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4700016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:57:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:43.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:57:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:44 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:44 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:45 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:45 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:45 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:57:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:45.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:57:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:45 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:45 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488001930 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:45 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:45.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:46 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:46 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:47 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4700016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:47 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:47 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:57:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:47.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:57:47 np0005593295 systemd-logind[786]: New session 43 of user zuul.
Jan 23 04:57:47 np0005593295 systemd[1]: Started Session 43 of User zuul.
Jan 23 04:57:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:47 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:47 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:47.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:48 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:48 np0005593295 python3.9[103722]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:57:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:48 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:49 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488002da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:49 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:49 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:57:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:49.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:57:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:49 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:49 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:57:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:49.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:57:49 np0005593295 python3.9[103879]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:50 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:50 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:50 np0005593295 python3.9[104056]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:57:50 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:51 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:51 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:51 np0005593295 python3.9[104134]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.f5_stt0h recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:51 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:51.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:51 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:51 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488002da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:51.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:52 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.117093) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272117337, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1429, "num_deletes": 252, "total_data_size": 4156598, "memory_usage": 4203608, "flush_reason": "Manual Compaction"}
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272135863, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 1767345, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10521, "largest_seqno": 11944, "table_properties": {"data_size": 1762718, "index_size": 2087, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11771, "raw_average_key_size": 20, "raw_value_size": 1752658, "raw_average_value_size": 2980, "num_data_blocks": 94, "num_entries": 588, "num_filter_entries": 588, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162145, "oldest_key_time": 1769162145, "file_creation_time": 1769162272, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 18796 microseconds, and 7421 cpu microseconds.
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.135963) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 1767345 bytes OK
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.136001) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.138574) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.138608) EVENT_LOG_v1 {"time_micros": 1769162272138602, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.138634) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 4149955, prev total WAL file size 4149955, number of live WAL files 2.
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.140245) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323533' seq:0, type:0; will stop at (end)
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(1725KB)], [21(13MB)]
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272140586, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 15548968, "oldest_snapshot_seqno": -1}
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4244 keys, 13455121 bytes, temperature: kUnknown
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272256073, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 13455121, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13422807, "index_size": 20620, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10629, "raw_key_size": 107932, "raw_average_key_size": 25, "raw_value_size": 13341254, "raw_average_value_size": 3143, "num_data_blocks": 884, "num_entries": 4244, "num_filter_entries": 4244, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769162272, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.256657) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 13455121 bytes
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.258367) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 134.5 rd, 116.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 13.1 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(16.4) write-amplify(7.6) OK, records in: 4708, records dropped: 464 output_compression: NoCompression
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.258388) EVENT_LOG_v1 {"time_micros": 1769162272258378, "job": 10, "event": "compaction_finished", "compaction_time_micros": 115563, "compaction_time_cpu_micros": 52610, "output_level": 6, "num_output_files": 1, "total_output_size": 13455121, "num_input_records": 4708, "num_output_records": 4244, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272259029, "job": 10, "event": "table_file_deletion", "file_number": 23}
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272261459, "job": 10, "event": "table_file_deletion", "file_number": 21}
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.139799) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.261489) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.261493) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.261495) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.261496) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:57:52 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:57:52.261498) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:57:52 np0005593295 python3.9[104288]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:57:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:52 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:53 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:53 np0005593295 python3.9[104366]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.c2vsg48p recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:53 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:53 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:53.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:53 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:53 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:53.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:53 np0005593295 python3.9[104518]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:57:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:54 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:54 np0005593295 python3.9[104672]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:57:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:54 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:55 np0005593295 python3.9[104750]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:57:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:55 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488002da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:55 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:55 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:57:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:55.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:57:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:55 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:55 np0005593295 python3.9[104902]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:57:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:55 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:55 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:55.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:56 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:56 np0005593295 python3.9[104981]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:57:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:56 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:56 np0005593295 python3.9[105134]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:57 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:57 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:57 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:57:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:57.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:57:57 np0005593295 python3.9[105286]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:57:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:57 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:57 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:57:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:57.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:57:58 np0005593295 python3.9[105389]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:58 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:58 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:58 np0005593295 python3.9[105543]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:57:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:59 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:57:59 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:59 np0005593295 python3.9[105621]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:57:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:59 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:57:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:59.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:57:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:57:59 2026: (VI_0) received an invalid passwd!
Jan 23 04:57:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:57:59 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:57:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:57:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:59.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:00 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:00 np0005593295 python3.9[105774]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:58:00 np0005593295 systemd[1]: Reloading.
Jan 23 04:58:00 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:58:00 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:58:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:00 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:00 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:01 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:01 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:01 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:01 np0005593295 python3.9[105964]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:01.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:01 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:01 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:58:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:01.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:58:01 np0005593295 python3.9[106042]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:02 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:02 np0005593295 python3.9[106196]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:02 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:02 np0005593295 python3.9[106274]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:03 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:03 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:03 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:58:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:03.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:58:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:03 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:03 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:03.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:03 np0005593295 python3.9[106426]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:58:03 np0005593295 systemd[1]: Reloading.
Jan 23 04:58:03 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:58:03 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:58:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:04 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:04 np0005593295 systemd[1]: Starting Create netns directory...
Jan 23 04:58:04 np0005593295 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 04:58:04 np0005593295 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 04:58:04 np0005593295 systemd[1]: Finished Create netns directory.
Jan 23 04:58:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:04 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:05 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:05 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:05 np0005593295 python3.9[106619]: ansible-ansible.builtin.service_facts Invoked
Jan 23 04:58:05 np0005593295 network[106636]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:58:05 np0005593295 network[106637]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:58:05 np0005593295 network[106638]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:58:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:05 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:58:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:05.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:58:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:05 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:05 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:05 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:05.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:06 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:06 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:07 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:07 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:07 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:07.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:07 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:07 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:58:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:07.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:58:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:08 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:08 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095808 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 04:58:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:09 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:09 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:09 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:09 np0005593295 python3.9[106907]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:09.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:09 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:09 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478001f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:58:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:09.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:58:09 np0005593295 python3.9[106985]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:10 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:10 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:10 np0005593295 python3.9[107139]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:10 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:11 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:11 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:11 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:11 np0005593295 python3.9[107291]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:58:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:11.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:58:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:11 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:11 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:11.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:11 np0005593295 python3.9[107369]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:12 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:12 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:13 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478001f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:13 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:13 np0005593295 python3.9[107523]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 23 04:58:13 np0005593295 systemd[1]: Starting Time & Date Service...
Jan 23 04:58:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:13 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:13 np0005593295 systemd[1]: Started Time & Date Service.
Jan 23 04:58:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:13.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:13 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:13 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:58:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:13.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:58:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:14 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:14 np0005593295 python3.9[107680]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:14 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:14 np0005593295 python3.9[107833]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:15 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:15 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:15 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478001f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:15 np0005593295 python3.9[107911]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:58:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:15.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:58:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:15 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:15 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd478001f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:15 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:58:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:15.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:58:16 np0005593295 python3.9[108063]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:16 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:16 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:16 np0005593295 python3.9[108143]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.6lbejo7y recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:17 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:17 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:17 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:17 np0005593295 python3.9[108295]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:17.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:17 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:17 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:58:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:17.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:58:17 np0005593295 python3.9[108398]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:18 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:18 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:18 np0005593295 python3.9[108552]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:58:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:19 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:58:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:19 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:19 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:19 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:58:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:19.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:58:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:19 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:19 np0005593295 python3[108705]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 04:58:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:19 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:58:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:19.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:58:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:20 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:20 np0005593295 python3.9[108859]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:20 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:20 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:21 np0005593295 python3.9[108937]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:21 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:21 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:21 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:21.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:21 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:21 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:21.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:22 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:22 np0005593295 python3.9[109090]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:22 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:58:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:22 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:58:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:22 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:22 np0005593295 python3.9[109216]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162301.3898644-896-8143406466741/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:23 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:23 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:23 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:58:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:23.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:58:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:23 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:23 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4780020f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:23 np0005593295 python3.9[109368]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:23.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:24 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:24 np0005593295 python3.9[109447]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:24 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:25 np0005593295 python3.9[109600]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:25 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:25 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:25 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:25 np0005593295 python3.9[109678]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:25.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:25 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:25 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 04:58:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:25 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:25 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:58:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:25.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:58:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:26 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:26 np0005593295 python3.9[109831]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:26 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:26 np0005593295 python3.9[109910]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:27 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:27 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:27 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:27 np0005593295 python3.9[110062]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:58:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:27.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:27 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:27 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:27.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:28 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:28 np0005593295 python3.9[110218]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:28 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:29 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:29 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4780036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:29 np0005593295 python3.9[110371]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:29 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:29.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:29 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:29 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:29 np0005593295 python3.9[110590]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:29.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:30 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:30 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095830 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 04:58:30 np0005593295 python3.9[110758]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 04:58:30 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:31 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:31 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:31 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:58:31 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:58:31 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:58:31 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:58:31 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:58:31 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:58:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:31 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4780036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:31 np0005593295 python3.9[110910]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 04:58:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:58:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:31.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:58:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:31 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:31 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:31 np0005593295 systemd[1]: session-43.scope: Deactivated successfully.
Jan 23 04:58:31 np0005593295 systemd[1]: session-43.scope: Consumed 30.036s CPU time.
Jan 23 04:58:31 np0005593295 systemd-logind[786]: Session 43 logged out. Waiting for processes to exit.
Jan 23 04:58:31 np0005593295 systemd-logind[786]: Removed session 43.
Jan 23 04:58:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:31.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:32 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:32 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:33 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:33 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:33 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:33.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:33 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:33 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4780043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:33.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:34 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:34 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:35 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:35 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4780043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:35 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:35.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:35 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:35 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:35 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:35.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:36 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:36 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:36 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:58:36 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:58:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:37 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:37 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:37 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:37.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:37 np0005593295 systemd-logind[786]: New session 44 of user zuul.
Jan 23 04:58:37 np0005593295 systemd[1]: Started Session 44 of User zuul.
Jan 23 04:58:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:37 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:37 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:37.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:38 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:38 np0005593295 python3.9[111147]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 23 04:58:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:38 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:39 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:39 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4780043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:39 np0005593295 python3.9[111300]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:58:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:39 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:39.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:39 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:39 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd490004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:39.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:40 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:40 np0005593295 python3.9[111455]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Jan 23 04:58:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:40 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:40 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:40 np0005593295 python3.9[111608]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.cgk9a7px follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:58:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:41 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:41 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd470004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:41 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4780043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:41.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:41 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:41 np0005593295 python3.9[111736]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.cgk9a7px mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162320.3867793-105-111277945519754/.source.cgk9a7px _original_basename=.nttzwjo3 follow=False checksum=6c63675b4fda7e0d01c328fcbe34dc890491aeeb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:41 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:58:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:41.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:58:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:42 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:42 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:42 np0005593295 python3.9[111890]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:58:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:43 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:43 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:43 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:43 np0005593295 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 04:58:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:58:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:43.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:58:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:43 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:43 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:43 np0005593295 python3.9[112045]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC+cj2so8SS29oYZ1K+7e02qi6fVkGXJzGMkIN9mgJPLCBtQ6vpBYEObTZZXuMIHhdiMUAp6RDjs11OXDkAB9R7e2ncjMKn7J2EHbmceT7rNq9L0w+QaLKFxl+xdJQ9QtO9ioNgJFXXQZt/IOeE8S4I5yhEM5jn+YEW0LPbp99Wz1d1Ob4GI1t0hCEv/4ayC3nRIXkuIhl7mrV0s22F8NE8f0hZZKaw1u8xmmpbD8ZVBsC6cxWE3kIQBmHu8q9tylaZjLsjGxBDUF9ko3bxeppvLPDMem89VLQCWbgmOHl5ZIPsyNglusTIBUp8uA7g+Agz1uMojClMHnsZl68WjbCAVcRA9y/UgXphGyEYZCUJMv8CjYKzxriyHALZl6YFSyC5ELlEAxL8fyTwtXhQ1+e/lI9Ak3n4suC6JyH0NQ27MPIf7riyUFJLw9lZaDerZOkvI7/Y2PfRvdfyZ57g/xgGeLY0Ch30SFVC04lNXIpsOWbLBOg0BMP9ZiciAYAF9Yc=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIreWuVcekgp7kF5pU+4TIKLHZyhuqd4Ly312ExEA5EG#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJWfXOTsTXqDhdGhW7VcUXsYqCS7TzCPyaa9/dA9e0xKjnni1/GRM8FdYXWYbGsNnBQFWk3/pXD6sj3jKzK34AM=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDA/6JnQZ3CFC7xgv4DrvdZizVbVnsolKcWkvqzGu1hFHGmOEb7ehbxGPHBnp2N9iRf13H12EI0qNI6A2f44V0oXE3SP+fpJ6PVYQRQpKqTEiweqZaHEyYE2FnKy0HDQisg5hwr1egYLjGXChdkyqWSokL1LqaCyD2+EcOzUvC/GuVQ7eQnQBIGBpYAnNzS/64KKOZ0+0soOPJGxVCma6JN/2GcCunX6j3HmkOOQeuEFETXfUPHh1ylu2+3yINl34ERJN5YwgR/S+BKENOsJTu5XkYTCvc90CuvfkoF9K5Y2yE5nKwZaSf7n2SbUPil2Zph4l7opsd5IKxi6k2mVzw/CO2NHr136BZ06+sKXytDgorWqWzqnci8zfxeYF3D7q7AXD+IDVMP5T6op93oS2enAQFHG1vTLB0otQqnxUgNANbJkrKgXAS8G8I1m2sPz+qOFuuZa2/nqhzrd6/DEur5VoW6n9c/OcrbfapLEzD1jQDmsQI7oZkT++dt3Ogb3Vk=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIII1sLqY7Nqi1A3CKXLokfn1vrns/lK1gUkDNSlbek2o#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM9QZXHUsthFMKA5Si4Htl7MIwK0G4VAltQgbo39JJHrgD7h27U1jbnuJQ1S2bBX8FMSkqf5TPmM7Gr9QOATO+4=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDWbrXZxuAw0n/xJmOvWW/Qbg53ya2CuJKzcHA+OvDpHLHGxkEuiUhwKvqUbfSTzn0o1M00OYITJIvZVINGRtQC7hGvBPWLVBON097mcmnju857I72U3dGdvGhnEUHyrglCV+xSkafQTTlnY9B59EKImUs/kiwRy3cYDWkCgthJgiPA4QSw6WrzaqpY2ET+7n+yY31EOagGA3ufW43qFbHX4diFuXpS1I1PLvvA4KINlMlsFcyR29j4nQk/vb5hMpLmBOlfVH16CXZC98a0ltp9ib7F3e1Wjdogj92kxwfQMYIeQEBp11Tc/PY5U90J51oyk8xYOKfsP3+r9yczmfRDjwR3+tzUMKyZYAsKQVcOGQC7x9sEXg3mBeXRVrlIVZFMuNVcYq4CY40fDIybcI25GxgRbQR7ZUWODG1SL7RF02Z+LQB6APXkzxdQUWLWPryj/EtOgnHQ1I0+BJTWrqGkKbSj41jhRTfS+MZvRXAJ+fNyZFhpkHo54DrCii4cbyM=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGRPkwTcFVg/dIKRq29iWBfkoVFqIQ1pXOCPxfcGWRFF#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGf/hJ2dg/PRwojw63FLyKqua+ChKP+2bc7Eb0p70H6ve1elFVeY8lVRXx33JWc2m/XfgSWPNcUs9zBG8QcFVak=#012 create=True mode=0644 path=/tmp/ansible.cgk9a7px state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:58:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:43.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:58:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:44 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:44 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:45 np0005593295 python3.9[112199]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.cgk9a7px' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:58:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:45 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:45 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:45 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4780043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:58:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:45.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:58:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:45 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:45 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:45 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:45.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:45 np0005593295 python3.9[112353]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.cgk9a7px state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:46 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:46 np0005593295 systemd[1]: session-44.scope: Deactivated successfully.
Jan 23 04:58:46 np0005593295 systemd[1]: session-44.scope: Consumed 5.826s CPU time.
Jan 23 04:58:46 np0005593295 systemd-logind[786]: Session 44 logged out. Waiting for processes to exit.
Jan 23 04:58:46 np0005593295 systemd-logind[786]: Removed session 44.
Jan 23 04:58:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:46 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:47 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:47 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:47 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:58:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:47.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:58:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:47 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:47 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd488003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:58:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:47.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:58:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:48 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:48 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:49 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:49 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:49 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd484002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.002000051s ======
Jan 23 04:58:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:49.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000051s
Jan 23 04:58:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:49 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:49 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4780043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:58:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:58:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:49.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:58:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:50 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:50 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:50 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:51 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[98316]: 23/01/2026 09:58:51 : epoch 697345e2 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd494001080 fd 38 proxy ignored for local
Jan 23 04:58:51 np0005593295 kernel: ganesha.nfsd[111658]: segfault at 50 ip 00007fd51fbef32e sp 00007fd4a8ff8210 error 4 in libntirpc.so.5.8[7fd51fbd4000+2c000] likely on CPU 1 (core 0, socket 1)
Jan 23 04:58:51 np0005593295 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 04:58:51 np0005593295 systemd[1]: Started Process Core Dump (PID 112384/UID 0).
Jan 23 04:58:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:58:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:51.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:58:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:51 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:58:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:51.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:58:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:52 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:52 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:52 np0005593295 systemd-coredump[112385]: Process 98321 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 60:#012#0  0x00007fd51fbef32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 23 04:58:53 np0005593295 systemd[1]: systemd-coredump@2-112384-0.service: Deactivated successfully.
Jan 23 04:58:53 np0005593295 systemd[1]: systemd-coredump@2-112384-0.service: Consumed 1.649s CPU time.
Jan 23 04:58:53 np0005593295 podman[112392]: 2026-01-23 09:58:53.095259898 +0000 UTC m=+0.035250895 container died d693275e105c73aef6f04d631fc637ff536d8a3b7f8c2d079dfe0bd3e5450fb1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.build-date=20250325)
Jan 23 04:58:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:53 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:53 np0005593295 systemd[1]: var-lib-containers-storage-overlay-b19ca06aee9ccd666281d5d22cdbe211ab7c13b8da2e9314aa375f7fd13de7bf-merged.mount: Deactivated successfully.
Jan 23 04:58:53 np0005593295 podman[112392]: 2026-01-23 09:58:53.273178711 +0000 UTC m=+0.213169688 container remove d693275e105c73aef6f04d631fc637ff536d8a3b7f8c2d079dfe0bd3e5450fb1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:58:53 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 04:58:53 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 04:58:53 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.128s CPU time.
Jan 23 04:58:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:53.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:53 np0005593295 systemd-logind[786]: New session 45 of user zuul.
Jan 23 04:58:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:53 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:53 np0005593295 systemd[1]: Started Session 45 of User zuul.
Jan 23 04:58:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:58:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:53.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:58:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:54 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:54 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:54 np0005593295 python3.9[112590]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:58:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:55 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:58:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:55.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:58:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:55 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:55 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:58:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:55.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:58:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:56 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:56 np0005593295 python3.9[112746]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 23 04:58:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:56 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:57 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:57 np0005593295 python3.9[112902]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:58:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095857 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 04:58:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:58:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:57.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:58:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:57 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:58:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:57.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:58:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:58 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:58 np0005593295 python3.9[113081]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:58:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:58 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:58 np0005593295 python3.9[113235]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:58:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:58:59 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:58:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:59.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:58:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:58:59 2026: (VI_0) received an invalid passwd!
Jan 23 04:58:59 np0005593295 python3.9[113387]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:58:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:58:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:59.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:00 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:00 np0005593295 systemd[1]: session-45.scope: Deactivated successfully.
Jan 23 04:59:00 np0005593295 systemd[1]: session-45.scope: Consumed 4.300s CPU time.
Jan 23 04:59:00 np0005593295 systemd-logind[786]: Session 45 logged out. Waiting for processes to exit.
Jan 23 04:59:00 np0005593295 systemd-logind[786]: Removed session 45.
Jan 23 04:59:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:00 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:00 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:01 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:59:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:01.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:59:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:01 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:59:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:01.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:59:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:02 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:02 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:03 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:03 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 3.
Jan 23 04:59:03 np0005593295 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:59:03 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.128s CPU time.
Jan 23 04:59:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:59:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:03.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:59:03 np0005593295 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:59:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:03 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:03 np0005593295 podman[113462]: 2026-01-23 09:59:03.774987317 +0000 UTC m=+0.043076780 container create a5a5cf8558c88760fb11f4b695f5b085f0ad0a5e0be971312152ed9ff3df32c8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 04:59:03 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2582de70091ee01944df38385b4c144b2e2a6dee2eeb4da56efe2aee3d46bad/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 04:59:03 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2582de70091ee01944df38385b4c144b2e2a6dee2eeb4da56efe2aee3d46bad/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:59:03 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2582de70091ee01944df38385b4c144b2e2a6dee2eeb4da56efe2aee3d46bad/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:59:03 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2582de70091ee01944df38385b4c144b2e2a6dee2eeb4da56efe2aee3d46bad/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:59:03 np0005593295 podman[113462]: 2026-01-23 09:59:03.756076401 +0000 UTC m=+0.024165884 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:59:03 np0005593295 podman[113462]: 2026-01-23 09:59:03.852926509 +0000 UTC m=+0.121016002 container init a5a5cf8558c88760fb11f4b695f5b085f0ad0a5e0be971312152ed9ff3df32c8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Jan 23 04:59:03 np0005593295 podman[113462]: 2026-01-23 09:59:03.858571927 +0000 UTC m=+0.126661390 container start a5a5cf8558c88760fb11f4b695f5b085f0ad0a5e0be971312152ed9ff3df32c8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Jan 23 04:59:03 np0005593295 bash[113462]: a5a5cf8558c88760fb11f4b695f5b085f0ad0a5e0be971312152ed9ff3df32c8
Jan 23 04:59:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:03 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 04:59:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:03 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 04:59:03 np0005593295 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:59:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:03.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:03 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 04:59:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:03 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 04:59:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:03 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 04:59:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:03 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 04:59:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:03 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 04:59:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:03 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:59:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:04 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:04 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:05 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:59:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:05.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:59:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:05 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:05 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:05.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:06 np0005593295 systemd-logind[786]: New session 46 of user zuul.
Jan 23 04:59:06 np0005593295 systemd[1]: Started Session 46 of User zuul.
Jan 23 04:59:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:06 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:06 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:07 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:07 np0005593295 python3.9[113676]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:59:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:07.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:07 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:59:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:07.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:59:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:08 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:08 np0005593295 python3.9[113833]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:59:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:08 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:09 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:09 np0005593295 python3.9[113918]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 04:59:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:09.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:09 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:59:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:09.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:59:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:10 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:10 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:59:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:10 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:59:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:10 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:10 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:11 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:59:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:11.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:59:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:11 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:11 np0005593295 python3.9[114071]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:59:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:11.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:12 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:12 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:13 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:13 np0005593295 python3.9[114224]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 04:59:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:13 np0005593295 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:59:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:13.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:13 np0005593295 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:59:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:13 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:59:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:13.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:59:14 np0005593295 python3.9[114375]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:59:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:14 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:14 np0005593295 python3.9[114527]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:59:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:14 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:15 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:15 np0005593295 systemd-logind[786]: Session 46 logged out. Waiting for processes to exit.
Jan 23 04:59:15 np0005593295 systemd[1]: session-46.scope: Deactivated successfully.
Jan 23 04:59:15 np0005593295 systemd[1]: session-46.scope: Consumed 6.502s CPU time.
Jan 23 04:59:15 np0005593295 systemd-logind[786]: Removed session 46.
Jan 23 04:59:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:59:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:15.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:59:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:15 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:15 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:15.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:16 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:16 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 04:59:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:16 : epoch 69734667 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 04:59:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:17 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:17 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36d0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:17 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36c40016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:17.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:17 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:17 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:17.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:18 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:18 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:19 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:19 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36d0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095919 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 04:59:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:19 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b4000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:19.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:19 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:19 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36c4002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:19.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:20 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:20 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:20 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:21 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:21 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b4000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:21 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:21.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:21 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:21 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36d00021f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:21.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:22 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:22 np0005593295 systemd-logind[786]: New session 47 of user zuul.
Jan 23 04:59:22 np0005593295 systemd[1]: Started Session 47 of User zuul.
Jan 23 04:59:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:22 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:23 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:23 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36d00021f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:23 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b4001cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:23 np0005593295 python3.9[114752]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:59:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:23.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:23 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:23 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:59:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:23.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:59:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:24 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:24 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:25 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:25 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36c4002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:25 np0005593295 python3.9[114910]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:25 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36d0002390 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 04:59:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:25.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 04:59:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:25 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:25 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:25 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b4001cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:25 np0005593295 python3.9[115062]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:59:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:25.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:59:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:26 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:26 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:26 np0005593295 python3.9[115216]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:27 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:27 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:27 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b4001cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:27 np0005593295 python3.9[115339]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162366.1671932-153-211043007712582/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=5f463a1334205a2aad5395f81514ee931215e9c5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:27.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:27 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:27 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36c4002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:59:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:27.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:59:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:28 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:28 np0005593295 python3.9[115492]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:28 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:28 np0005593295 python3.9[115616]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162367.6790667-153-217915155126637/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=ff17d6d1438a69ae92e7570d79b66fb807ae4885 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:29 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:29 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36d00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:29 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:29 np0005593295 python3.9[115768]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:59:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:29.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:59:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:29 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:29 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:29 np0005593295 python3.9[115891]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162368.8758805-153-249246164828380/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=74ae1654d12c23c4d6b67ccf19cdb7558a450192 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:59:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:29.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:59:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:30 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.349027) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370349244, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1159, "num_deletes": 251, "total_data_size": 2886732, "memory_usage": 2912480, "flush_reason": "Manual Compaction"}
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370364957, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 1881107, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 11950, "largest_seqno": 13103, "table_properties": {"data_size": 1876037, "index_size": 2594, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10506, "raw_average_key_size": 19, "raw_value_size": 1865914, "raw_average_value_size": 3386, "num_data_blocks": 116, "num_entries": 551, "num_filter_entries": 551, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162273, "oldest_key_time": 1769162273, "file_creation_time": 1769162370, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 16046 microseconds, and 7197 cpu microseconds.
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.365125) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 1881107 bytes OK
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.365170) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.367702) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.367752) EVENT_LOG_v1 {"time_micros": 1769162370367745, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.367783) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 2881188, prev total WAL file size 2881188, number of live WAL files 2.
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.369035) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(1837KB)], [24(12MB)]
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370369200, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 15336228, "oldest_snapshot_seqno": -1}
Jan 23 04:59:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:30 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:30 np0005593295 python3.9[116045]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4279 keys, 13223563 bytes, temperature: kUnknown
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370877589, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 13223563, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13192084, "index_size": 19657, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10757, "raw_key_size": 109428, "raw_average_key_size": 25, "raw_value_size": 13110978, "raw_average_value_size": 3064, "num_data_blocks": 828, "num_entries": 4279, "num_filter_entries": 4279, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769162370, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.974956) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 13223563 bytes
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.978468) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 30.2 rd, 26.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.8, 12.8 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(15.2) write-amplify(7.0) OK, records in: 4795, records dropped: 516 output_compression: NoCompression
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.978515) EVENT_LOG_v1 {"time_micros": 1769162370978498, "job": 12, "event": "compaction_finished", "compaction_time_micros": 508562, "compaction_time_cpu_micros": 45799, "output_level": 6, "num_output_files": 1, "total_output_size": 13223563, "num_input_records": 4795, "num_output_records": 4279, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370979246, "job": 12, "event": "table_file_deletion", "file_number": 26}
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370982482, "job": 12, "event": "table_file_deletion", "file_number": 24}
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.368883) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.982591) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.982599) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.982601) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.982603) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:30 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-09:59:30.982604) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:31 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:31 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36c4002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:31 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36d0009740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:31 np0005593295 python3.9[116197]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:59:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:31.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:59:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:31 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[113477]: 23/01/2026 09:59:31 : epoch 69734667 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f36b8002b10 fd 38 proxy ignored for local
Jan 23 04:59:31 np0005593295 kernel: ganesha.nfsd[114567]: segfault at 50 ip 00007f3759e1a32e sp 00007f36c2ffc210 error 4 in libntirpc.so.5.8[7f3759dff000+2c000] likely on CPU 0 (core 0, socket 0)
Jan 23 04:59:31 np0005593295 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 04:59:31 np0005593295 systemd[1]: Started Process Core Dump (PID 116351/UID 0).
Jan 23 04:59:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:31.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:32 np0005593295 python3.9[116349]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:32 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:32 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:32 np0005593295 python3.9[116476]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162371.590063-325-153310206666673/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=d3518b087b935787ae8459844310cc45ab489248 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:33 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:33 np0005593295 systemd-coredump[116352]: Process 113481 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 53:#012#0  0x00007f3759e1a32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012#1  0x0000000000000000 n/a (n/a + 0x0)#012#2  0x00007f3759e24900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)#012ELF object binary architecture: AMD x86-64
Jan 23 04:59:33 np0005593295 python3.9[116628]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:33 np0005593295 systemd[1]: systemd-coredump@3-116351-0.service: Deactivated successfully.
Jan 23 04:59:33 np0005593295 systemd[1]: systemd-coredump@3-116351-0.service: Consumed 1.400s CPU time.
Jan 23 04:59:33 np0005593295 podman[116641]: 2026-01-23 09:59:33.481914055 +0000 UTC m=+0.033789712 container died a5a5cf8558c88760fb11f4b695f5b085f0ad0a5e0be971312152ed9ff3df32c8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Jan 23 04:59:33 np0005593295 systemd[1]: var-lib-containers-storage-overlay-a2582de70091ee01944df38385b4c144b2e2a6dee2eeb4da56efe2aee3d46bad-merged.mount: Deactivated successfully.
Jan 23 04:59:33 np0005593295 podman[116641]: 2026-01-23 09:59:33.526123435 +0000 UTC m=+0.077999072 container remove a5a5cf8558c88760fb11f4b695f5b085f0ad0a5e0be971312152ed9ff3df32c8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2)
Jan 23 04:59:33 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 04:59:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:33.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:33 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 04:59:33 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.583s CPU time.
Jan 23 04:59:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:33 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:33 np0005593295 python3.9[116798]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162372.8841522-325-34392993972033/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=7ea5769d722c11e7459792c631f886a53fdd1360 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:59:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:33.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:59:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:34 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:34 np0005593295 python3.9[116952]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:34 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:35 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:35 np0005593295 python3.9[117075]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162374.1205697-325-112555519153618/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=c7b2cc1434b948bda234b68388cbd799abca388a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:35.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:35 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:35 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:35 np0005593295 python3.9[117227]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:59:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:35.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:59:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:36 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:36 np0005593295 python3.9[117431]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:36 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:37 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:37 np0005593295 python3.9[117615]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:37.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:37 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:37 np0005593295 python3.9[117738]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162376.8699226-503-40811081212946/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=377da3c129b85449f0af58d2fb6b8163dbdb149d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:37.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:38 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:38 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:59:38 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:59:38 np0005593295 python3.9[117917]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:38 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:39 np0005593295 python3.9[118040]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162378.0618517-503-158348946627568/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=7ea5769d722c11e7459792c631f886a53fdd1360 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:39 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095939 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 04:59:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:59:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:39.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:59:39 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:59:39 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:59:39 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:59:39 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:59:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:39 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:39 np0005593295 python3.9[118192]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:39.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:40 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:40 np0005593295 python3.9[118316]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162379.2646806-503-23478166857840/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=d8c8d38f928b275e03f3fce0093c5c40b17e4fa7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:40 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:40 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:41 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:59:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:41.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:59:41 np0005593295 python3.9[118469]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:41 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:41.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:42 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:42 np0005593295 python3.9[118622]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:42 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:42 np0005593295 python3.9[118746]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162381.8422322-703-87408833851170/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:43 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:43.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:43 np0005593295 python3.9[118898]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:43 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:43 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 4.
Jan 23 04:59:43 np0005593295 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:59:43 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.583s CPU time.
Jan 23 04:59:43 np0005593295 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 04:59:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:59:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:43.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:59:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:44 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:44 np0005593295 podman[119043]: 2026-01-23 09:59:44.026744341 +0000 UTC m=+0.024273455 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 04:59:44 np0005593295 podman[119043]: 2026-01-23 09:59:44.216784822 +0000 UTC m=+0.214313916 container create 4f256f8471d2d67936bf5479a009e13411000f6a3de9d5c1413008f3e0bb0af9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid)
Jan 23 04:59:44 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61f999682fe7ba096df068ab99db190302f37de217dbe7d7604ba685fdad3a63/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 04:59:44 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61f999682fe7ba096df068ab99db190302f37de217dbe7d7604ba685fdad3a63/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:59:44 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61f999682fe7ba096df068ab99db190302f37de217dbe7d7604ba685fdad3a63/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:59:44 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61f999682fe7ba096df068ab99db190302f37de217dbe7d7604ba685fdad3a63/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:59:44 np0005593295 python3.9[119108]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:44 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:44 np0005593295 podman[119043]: 2026-01-23 09:59:44.770917798 +0000 UTC m=+0.768446902 container init 4f256f8471d2d67936bf5479a009e13411000f6a3de9d5c1413008f3e0bb0af9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Jan 23 04:59:44 np0005593295 podman[119043]: 2026-01-23 09:59:44.77624462 +0000 UTC m=+0.773773714 container start 4f256f8471d2d67936bf5479a009e13411000f6a3de9d5c1413008f3e0bb0af9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:59:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:44 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 04:59:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:44 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 04:59:44 np0005593295 bash[119043]: 4f256f8471d2d67936bf5479a009e13411000f6a3de9d5c1413008f3e0bb0af9
Jan 23 04:59:44 np0005593295 python3.9[119237]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162383.8366396-772-219614496771385/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:44 np0005593295 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 04:59:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:45 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:45 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 04:59:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:45 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 04:59:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:45 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 04:59:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:45 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 04:59:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:45 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 04:59:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:45 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 04:59:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:45.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:45 np0005593295 python3.9[119428]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:45 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:45 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:45.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:46 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:46 np0005593295 python3.9[119581]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:46 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:46 np0005593295 python3.9[119705]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162385.854258-834-205280207384106/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:47 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:59:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:47.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:59:47 np0005593295 python3.9[119857]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:47 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 04:59:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:48.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 04:59:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:48 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:48 np0005593295 python3.9[120010]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:48 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:48 np0005593295 python3.9[120134]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162387.8260078-906-205682684487946/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:49 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:49 np0005593295 python3.9[120286]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:49.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:49 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:59:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:50.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:59:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:50 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:50 np0005593295 python3.9[120464]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:50 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:59:50 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 04:59:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:50 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:50 np0005593295 python3.9[120588]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162389.7764578-971-195493562681958/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:50 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:51 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:51 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 04:59:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:51 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 04:59:51 np0005593295 python3.9[120740]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:59:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:59:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:51.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:59:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:51 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:52.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:52 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:52 np0005593295 python3.9[120894]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:59:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:52 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:52 np0005593295 python3.9[121018]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162391.6924145-1042-182145774667505/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:59:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:53 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:59:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:53.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:59:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:53 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:54.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:54 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:54 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:55 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:55 np0005593295 systemd[1]: session-47.scope: Deactivated successfully.
Jan 23 04:59:55 np0005593295 systemd[1]: session-47.scope: Consumed 23.274s CPU time.
Jan 23 04:59:55 np0005593295 systemd-logind[786]: Session 47 logged out. Waiting for processes to exit.
Jan 23 04:59:55 np0005593295 systemd-logind[786]: Removed session 47.
Jan 23 04:59:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 04:59:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:55.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 04:59:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:55 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:55 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:56.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:56 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:56 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/095956 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:57 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 04:59:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:57.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:57 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e10000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:58.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:58 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:58 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 09:59:59 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:59 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e08001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:59 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 04:59:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 04:59:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:59.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 09:59:59 2026: (VI_0) received an invalid passwd!
Jan 23 04:59:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 09:59:59 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:00.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:00 np0005593295 ceph-mon[75771]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 05:00:00 np0005593295 ceph-mon[75771]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 05:00:00 np0005593295 ceph-mon[75771]:     osd.1 observed slow operation indications in BlueStore
Jan 23 05:00:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:00 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:01 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100001 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:00:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:01 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e08001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:00:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:01.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:00:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:01 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:00:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:02.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:00:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:02 np0005593295 systemd-logind[786]: New session 48 of user zuul.
Jan 23 05:00:02 np0005593295 systemd[1]: Started Session 48 of User zuul.
Jan 23 05:00:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:03 np0005593295 python3.9[121249]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:03 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:03 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:00:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:03.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:00:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:03 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e08001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:04.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:04 np0005593295 python3.9[121401]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:04 np0005593295 python3.9[121526]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162403.3548565-59-245424719842205/.source.conf _original_basename=ceph.conf follow=False checksum=c8d90d44a83782ff84a3d797d09c3b204e2d1c61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:05 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:05 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:05 np0005593295 python3.9[121678]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:05.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:05 : epoch 69734690 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:00:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:05 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:05 np0005593295 python3.9[121801]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162404.917758-59-15592740715908/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=a6273c4bda164a032598e5e81cbd7f6e9c0876d5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:05 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:06.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:06 np0005593295 systemd[1]: session-48.scope: Deactivated successfully.
Jan 23 05:00:06 np0005593295 systemd[1]: session-48.scope: Consumed 2.714s CPU time.
Jan 23 05:00:06 np0005593295 systemd-logind[786]: Session 48 logged out. Waiting for processes to exit.
Jan 23 05:00:06 np0005593295 systemd-logind[786]: Removed session 48.
Jan 23 05:00:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:07 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e08001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:07 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:07.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:07 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:00:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:08.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:00:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:08 : epoch 69734690 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:00:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:08 : epoch 69734690 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:00:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:08 : epoch 69734690 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:00:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:09 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:09 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:00:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:09.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:00:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:09 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:10.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:10 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:11 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:11 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:11.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:11 : epoch 69734690 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:00:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:11 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:12.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:12 np0005593295 systemd-logind[786]: New session 49 of user zuul.
Jan 23 05:00:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:12 np0005593295 systemd[1]: Started Session 49 of User zuul.
Jan 23 05:00:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:13 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:13 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:13.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:13 np0005593295 python3.9[121987]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 05:00:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:13 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:00:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:14.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:00:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:15 np0005593295 python3.9[122145]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:00:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:15 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:15 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:00:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:15.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:00:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:15 np0005593295 python3.9[122297]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:00:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:15 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:15 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:00:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:16.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:00:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:16 np0005593295 python3.9[122449]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 05:00:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:17 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:17 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:00:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:17.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:00:17 np0005593295 python3.9[122601]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 23 05:00:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:17 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:18.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100018 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:00:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:19 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:19 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:19.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:19 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:20.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:20 np0005593295 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 23 05:00:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:20 np0005593295 python3.9[122787]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 05:00:21 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:21 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:21 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:21.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:21 np0005593295 python3.9[122871]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 05:00:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:21 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:22.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:23 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:23 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:23.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:23 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:24.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:24 np0005593295 python3.9[123028]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 05:00:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:25 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:25 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:25 np0005593295 python3[123183]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 23 05:00:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:25.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:25 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:00:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:26.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:00:26 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:26 np0005593295 python3.9[123337]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:27 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:27 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:27 np0005593295 python3.9[123489]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:27.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:27 np0005593295 python3.9[123567]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:27 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:28.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:28 np0005593295 python3.9[123721]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:29 np0005593295 python3.9[123799]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.gjyi_25y recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:29 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:29 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:00:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:29.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:00:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:29 np0005593295 python3.9[123953]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:29 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6dec000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:30.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:30 np0005593295 python3.9[124033]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:31 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:31 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:31 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:31 np0005593295 python3.9[124185]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:00:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:00:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:31.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:00:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:31 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:32.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:32 np0005593295 python3[124340]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 05:00:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:33 np0005593295 python3.9[124492]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:33 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6dec0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:33 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:33.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:33 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:34 np0005593295 python3.9[124617]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162432.7393332-429-13514922869121/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:34.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:35 np0005593295 python3.9[124771]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:35 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:35 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6dec0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:35.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:35 np0005593295 python3.9[124896]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162434.2852414-474-106652998090111/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:35 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:36.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:36 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:36 np0005593295 python3.9[125050]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:37 np0005593295 python3.9[125175]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162435.9479222-519-222780313790841/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:37 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:37 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:37.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:37 np0005593295 python3.9[125327]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:37 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6dec0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:38.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:38 np0005593295 python3.9[125479]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162437.420987-563-249308349665253/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:39 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:39 np0005593295 python3.9[125631]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:39 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:00:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:39.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:00:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:39 np0005593295 python3.9[125756]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162438.7279024-609-115942111541063/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:39 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:40.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:40 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:00:40 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 2187 writes, 13K keys, 2187 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.06 MB/s#012Cumulative WAL: 2187 writes, 2187 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2187 writes, 13K keys, 2187 commit groups, 1.0 writes per commit group, ingest: 36.20 MB, 0.06 MB/s#012Interval WAL: 2187 writes, 2187 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     89.8      0.23              0.10         6    0.038       0      0       0.0       0.0#012  L6      1/0   12.61 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   3.0     59.4     52.6      1.17              0.41         5    0.233     22K   2300       0.0       0.0#012 Sum      1/0   12.61 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   4.0     49.7     58.7      1.40              0.51        11    0.127     22K   2300       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   4.0     49.7     58.8      1.39              0.51        10    0.139     22K   2300       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   0.0     59.4     52.6      1.17              0.41         5    0.233     22K   2300       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     90.7      0.23              0.10         5    0.045       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.020, interval 0.020#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.08 GB write, 0.14 MB/s write, 0.07 GB read, 0.12 MB/s read, 1.4 seconds#012Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.07 GB read, 0.12 MB/s read, 1.4 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55c6513709b0#2 capacity: 304.00 MB usage: 1.49 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.00017 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(87,1.28 MB,0.421373%) FilterBlock(11,73.42 KB,0.0235859%) IndexBlock(11,142.14 KB,0.0456609%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 05:00:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:40 np0005593295 python3.9[125910]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:41 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:41 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6dec002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:41 np0005593295 python3.9[126062]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:00:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:41 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:00:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:41.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:00:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:41 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:42.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:42 np0005593295 python3.9[126218]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:43 np0005593295 python3.9[126371]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:00:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:43 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:43 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:00:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:43.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:00:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:43 np0005593295 python3.9[126524]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:00:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:43 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:00:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:44.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:00:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:44 np0005593295 python3.9[126680]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:00:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:45 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:45 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6dec002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:45 np0005593295 python3.9[126835]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:00:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:45.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:00:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:45 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:46.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:46 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:46 np0005593295 python3.9[126987]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 05:00:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:47 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:47 np0005593295 ceph-osd[81231]: bluestore.MempoolThread fragmentation_score=0.000021 took=0.000131s
Jan 23 05:00:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:47 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:47.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:47 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6dec003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:48 np0005593295 python3.9[127140]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:00:48 np0005593295 ovs-vsctl[127142]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 23 05:00:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:48.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:49 np0005593295 python3.9[127295]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:00:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:49 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:49 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:49.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:49 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:50.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:51 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6dec003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:51 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6dec003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:00:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:51.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:00:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:51 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:52.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:53 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:53 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6dec003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:00:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:53.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:00:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:53 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:54.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:55 np0005593295 python3.9[127450]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:00:55 np0005593295 ovs-vsctl[127528]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 23 05:00:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:55 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:55 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:55 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:55.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:55 np0005593295 python3.9[127728]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:00:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:55 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:56.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:56 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:00:56 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:00:56 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:00:56 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:00:56 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:00:56 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:00:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:56 np0005593295 python3.9[127915]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:00:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6dec003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:57 np0005593295 python3.9[128067]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:57.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:57 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:57 np0005593295 python3.9[128145]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:00:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:58.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:58 np0005593295 python3.9[128324]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:00:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:59 np0005593295 python3.9[128402]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:00:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:00:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:59 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:59 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:00:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:00:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:00:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:59.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:00:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:00:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:00:59 np0005593295 python3.9[128555]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:00:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:00:59 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df80014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:01:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:00.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:01:00 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:00 np0005593295 python3.9[128709]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:01:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:01 np0005593295 python3.9[128787]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:01:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:01 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:01 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:01:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:01.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:01:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:01 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:02 np0005593295 python3.9[128951]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:01:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:02.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:02 np0005593295 python3.9[129031]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:01:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:03 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df80014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:03 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:03 np0005593295 python3.9[129183]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:01:03 np0005593295 systemd[1]: Reloading.
Jan 23 05:01:03 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:01:03 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:01:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:03.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:03 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:04.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:04 np0005593295 python3.9[129399]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:01:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:05 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:05 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8002260 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:05 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:05 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:01:05 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:01:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:01:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:05.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:01:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:05 np0005593295 python3.9[129478]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:01:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:05 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:06.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:06 np0005593295 python3.9[129632]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:01:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:07 np0005593295 python3.9[129710]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:01:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:07 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:07 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:07.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:07 np0005593295 python3.9[129862]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:01:07 np0005593295 systemd[1]: Reloading.
Jan 23 05:01:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:07 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:08 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:01:08 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:01:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:08.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:08 np0005593295 systemd[1]: Starting Create netns directory...
Jan 23 05:01:08 np0005593295 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 05:01:08 np0005593295 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 05:01:08 np0005593295 systemd[1]: Finished Create netns directory.
Jan 23 05:01:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:09 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:09 np0005593295 python3.9[130057]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:01:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:09 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:01:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:09.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:01:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:09 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:10.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:10 np0005593295 python3.9[130210]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:01:10 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:11 np0005593295 python3.9[130334]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162469.924888-1362-159345511888092/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:01:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:11 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8002ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:11 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8002ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:01:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:11.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:01:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:11 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:12 np0005593295 python3.9[130487]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:01:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:01:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:12.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:01:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:12 np0005593295 python3.9[130640]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:01:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:13 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:13 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:13 np0005593295 python3.9[130792]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:01:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:13.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:14 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8002ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:14.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:14 np0005593295 python3.9[130916]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162473.1873362-1460-259907696972181/.source.json _original_basename=.yzj27p_h follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:01:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:14 np0005593295 python3.9[131067]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:01:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:15 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:15 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de40032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:15 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:15.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:16 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:16.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:17 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8003060 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:17 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:17 np0005593295 python3.9[131492]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 23 05:01:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:17.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:18 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:18.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:18 np0005593295 python3.9[131671]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 05:01:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:19 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:19 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8003200 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:19.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:20 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:20 np0005593295 python3[131823]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 05:01:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:20.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:20 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:21 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:21 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:01:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:21.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:01:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:22 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:01:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:22.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:01:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:23 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:23 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:01:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:23.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:01:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:24 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003c90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:01:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:24.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:01:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100124 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:01:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:25 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8003200 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:25 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8003200 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:25 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:25.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:25 np0005593295 podman[131835]: 2026-01-23 10:01:25.920347942 +0000 UTC m=+5.695967912 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 23 05:01:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:26 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8003200 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:26 np0005593295 podman[131961]: 2026-01-23 10:01:26.072366932 +0000 UTC m=+0.051834974 container create 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:01:26 np0005593295 podman[131961]: 2026-01-23 10:01:26.046566334 +0000 UTC m=+0.026034396 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 23 05:01:26 np0005593295 python3[131823]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 23 05:01:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:26.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:27 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:27 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8003200 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:27.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:28 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:28.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:29 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:29 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003cd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:29 np0005593295 python3.9[132155]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:01:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:29.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:30 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df8003200 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:30.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:30 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:30 np0005593295 python3.9[132310]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:01:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:30 np0005593295 python3.9[132387]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:01:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:31 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:31 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:31 np0005593295 python3.9[132539]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769162490.9878035-1694-208021842378569/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:01:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:01:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:31.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:01:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:32 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:32 np0005593295 python3.9[132615]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 05:01:32 np0005593295 systemd[1]: Reloading.
Jan 23 05:01:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:32.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:32 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:01:32 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:01:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:33 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:33 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:01:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:33.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:01:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:34 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:01:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:34.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:01:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:34 : epoch 69734690 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:01:34 np0005593295 python3.9[132729]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:01:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:34 np0005593295 systemd[1]: Reloading.
Jan 23 05:01:34 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:01:34 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:01:35 np0005593295 systemd[1]: Starting ovn_controller container...
Jan 23 05:01:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:35 np0005593295 systemd[1]: Started libcrun container.
Jan 23 05:01:35 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4214ae656bf086d4cf887d9c946eede27210a1dc28094e67a6a5cb3b8ef610d9/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 23 05:01:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:35 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:35 np0005593295 systemd[1]: Started /usr/bin/podman healthcheck run 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087.
Jan 23 05:01:35 np0005593295 podman[132773]: 2026-01-23 10:01:35.389581131 +0000 UTC m=+0.312058878 container init 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:01:35 np0005593295 ovn_controller[132789]: + sudo -E kolla_set_configs
Jan 23 05:01:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:35 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003d10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:35 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:35 np0005593295 podman[132773]: 2026-01-23 10:01:35.419463444 +0000 UTC m=+0.341941171 container start 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 05:01:35 np0005593295 edpm-start-podman-container[132773]: ovn_controller
Jan 23 05:01:35 np0005593295 systemd[1]: Created slice User Slice of UID 0.
Jan 23 05:01:35 np0005593295 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 23 05:01:35 np0005593295 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 23 05:01:35 np0005593295 systemd[1]: Starting User Manager for UID 0...
Jan 23 05:01:35 np0005593295 podman[132795]: 2026-01-23 10:01:35.503586872 +0000 UTC m=+0.071516537 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 23 05:01:35 np0005593295 systemd[1]: 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087-30cbb83d56284482.service: Main process exited, code=exited, status=1/FAILURE
Jan 23 05:01:35 np0005593295 systemd[1]: 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087-30cbb83d56284482.service: Failed with result 'exit-code'.
Jan 23 05:01:35 np0005593295 edpm-start-podman-container[132772]: Creating additional drop-in dependency for "ovn_controller" (7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087)
Jan 23 05:01:35 np0005593295 systemd[1]: Reloading.
Jan 23 05:01:35 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:01:35 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:01:35 np0005593295 systemd[132824]: Queued start job for default target Main User Target.
Jan 23 05:01:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:35 np0005593295 systemd[132824]: Created slice User Application Slice.
Jan 23 05:01:35 np0005593295 systemd[132824]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 23 05:01:35 np0005593295 systemd[132824]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 05:01:35 np0005593295 systemd[132824]: Reached target Paths.
Jan 23 05:01:35 np0005593295 systemd[132824]: Reached target Timers.
Jan 23 05:01:35 np0005593295 systemd[132824]: Starting D-Bus User Message Bus Socket...
Jan 23 05:01:35 np0005593295 systemd[132824]: Starting Create User's Volatile Files and Directories...
Jan 23 05:01:35 np0005593295 systemd[132824]: Finished Create User's Volatile Files and Directories.
Jan 23 05:01:35 np0005593295 systemd[132824]: Listening on D-Bus User Message Bus Socket.
Jan 23 05:01:35 np0005593295 systemd[132824]: Reached target Sockets.
Jan 23 05:01:35 np0005593295 systemd[132824]: Reached target Basic System.
Jan 23 05:01:35 np0005593295 systemd[132824]: Reached target Main User Target.
Jan 23 05:01:35 np0005593295 systemd[132824]: Startup finished in 234ms.
Jan 23 05:01:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:01:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:35.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:01:35 np0005593295 systemd[1]: Started User Manager for UID 0.
Jan 23 05:01:35 np0005593295 systemd[1]: Started ovn_controller container.
Jan 23 05:01:35 np0005593295 systemd[1]: Started Session c1 of User root.
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: INFO:__main__:Validating config file
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: INFO:__main__:Writing out command to execute
Jan 23 05:01:36 np0005593295 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: ++ cat /run_command
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: + ARGS=
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: + sudo kolla_copy_cacerts
Jan 23 05:01:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:36 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:36 np0005593295 systemd[1]: Started Session c2 of User root.
Jan 23 05:01:36 np0005593295 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: + [[ ! -n '' ]]
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: + . kolla_extend_start
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: + umask 0022
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 23 05:01:36 np0005593295 NetworkManager[48964]: <info>  [1769162496.0907] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Jan 23 05:01:36 np0005593295 NetworkManager[48964]: <info>  [1769162496.0916] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 05:01:36 np0005593295 NetworkManager[48964]: <warn>  [1769162496.0918] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 05:01:36 np0005593295 NetworkManager[48964]: <info>  [1769162496.0927] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 23 05:01:36 np0005593295 NetworkManager[48964]: <info>  [1769162496.0933] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Jan 23 05:01:36 np0005593295 NetworkManager[48964]: <info>  [1769162496.0938] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 23 05:01:36 np0005593295 kernel: br-int: entered promiscuous mode
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00022|main|INFO|OVS feature set changed, force recompute.
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 23 05:01:36 np0005593295 systemd-udevd[132921]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:01:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00001|statctrl(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00002|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 05:01:36 np0005593295 ovn_controller[132789]: 2026-01-23T10:01:36Z|00003|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 05:01:36 np0005593295 NetworkManager[48964]: <info>  [1769162496.1847] manager: (ovn-eb059b-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 23 05:01:36 np0005593295 NetworkManager[48964]: <info>  [1769162496.1857] manager: (ovn-170ec8-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 23 05:01:36 np0005593295 NetworkManager[48964]: <info>  [1769162496.1864] manager: (ovn-57e418-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Jan 23 05:01:36 np0005593295 kernel: genev_sys_6081: entered promiscuous mode
Jan 23 05:01:36 np0005593295 NetworkManager[48964]: <info>  [1769162496.2011] device (genev_sys_6081): carrier: link connected
Jan 23 05:01:36 np0005593295 NetworkManager[48964]: <info>  [1769162496.2014] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/22)
Jan 23 05:01:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:01:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:36.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:01:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:37 np0005593295 python3.9[133052]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 23 05:01:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:37 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:37 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:37.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:38 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003d30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:38.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:38 np0005593295 python3.9[133206]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:01:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:38 : epoch 69734690 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:01:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:38 : epoch 69734690 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:01:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:39 np0005593295 python3.9[133354]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162497.7788699-1830-160890388242940/.source.yaml _original_basename=.p8utvsxh follow=False checksum=a80724acad465d51ee59522dfe4a3a5c05876d7d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:01:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:39 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:39 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:39.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:39 np0005593295 python3.9[133506]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:01:39 np0005593295 ovs-vsctl[133507]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 23 05:01:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:40 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:40.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:40 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:40 np0005593295 python3.9[133661]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:01:40 np0005593295 ovs-vsctl[133663]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 23 05:01:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:41 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:41 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:41 np0005593295 python3.9[133816]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:01:41 np0005593295 ovs-vsctl[133817]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 23 05:01:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:01:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:41.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:01:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:42 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:42 np0005593295 systemd[1]: session-49.scope: Deactivated successfully.
Jan 23 05:01:42 np0005593295 systemd[1]: session-49.scope: Consumed 59.517s CPU time.
Jan 23 05:01:42 np0005593295 systemd-logind[786]: Session 49 logged out. Waiting for processes to exit.
Jan 23 05:01:42 np0005593295 systemd-logind[786]: Removed session 49.
Jan 23 05:01:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:42.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:42 : epoch 69734690 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:01:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:43 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:43 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003d70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:01:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:43.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:01:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:44 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:44.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:45 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df0001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:45 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:45 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:45.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:46 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:46.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:46 np0005593295 systemd[1]: Stopping User Manager for UID 0...
Jan 23 05:01:46 np0005593295 systemd[132824]: Activating special unit Exit the Session...
Jan 23 05:01:46 np0005593295 systemd[132824]: Stopped target Main User Target.
Jan 23 05:01:46 np0005593295 systemd[132824]: Stopped target Basic System.
Jan 23 05:01:46 np0005593295 systemd[132824]: Stopped target Paths.
Jan 23 05:01:46 np0005593295 systemd[132824]: Stopped target Sockets.
Jan 23 05:01:46 np0005593295 systemd[132824]: Stopped target Timers.
Jan 23 05:01:46 np0005593295 systemd[132824]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 05:01:46 np0005593295 systemd[132824]: Closed D-Bus User Message Bus Socket.
Jan 23 05:01:46 np0005593295 systemd[132824]: Stopped Create User's Volatile Files and Directories.
Jan 23 05:01:46 np0005593295 systemd[132824]: Removed slice User Application Slice.
Jan 23 05:01:46 np0005593295 systemd[132824]: Reached target Shutdown.
Jan 23 05:01:46 np0005593295 systemd[132824]: Finished Exit the Session.
Jan 23 05:01:46 np0005593295 systemd[132824]: Reached target Exit the Session.
Jan 23 05:01:46 np0005593295 systemd[1]: user@0.service: Deactivated successfully.
Jan 23 05:01:46 np0005593295 systemd[1]: Stopped User Manager for UID 0.
Jan 23 05:01:46 np0005593295 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 23 05:01:46 np0005593295 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 23 05:01:46 np0005593295 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 23 05:01:46 np0005593295 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 23 05:01:46 np0005593295 systemd[1]: Removed slice User Slice of UID 0.
Jan 23 05:01:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100146 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:01:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:47 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:47 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:47.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:48 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:48 np0005593295 systemd-logind[786]: New session 51 of user zuul.
Jan 23 05:01:48 np0005593295 systemd[1]: Started Session 51 of User zuul.
Jan 23 05:01:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:01:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:48.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:01:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:49 np0005593295 python3.9[134006]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 05:01:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:49 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df00038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:49 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:49.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:50 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df4003dd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:50.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:50 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:50 np0005593295 python3.9[134164]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:01:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:51 np0005593295 python3.9[134316]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:01:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:51 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:51 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df00038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:01:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:51.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:01:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:52 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:52 np0005593295 python3.9[134469]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:01:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:52.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:52 np0005593295 python3.9[134622]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:01:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:53 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:53 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6de4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:53 np0005593295 python3.9[134774]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:01:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:01:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:53.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:01:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:54 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6df00038c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:54.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:54 np0005593295 python3.9[134926]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 05:01:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:55 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:01:55 np0005593295 python3.9[135079]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 23 05:01:55 np0005593295 kernel: ganesha.nfsd[121050]: segfault at 50 ip 00007f6e9b73932e sp 00007f6e2dffa210 error 4 in libntirpc.so.5.8[7f6e9b71e000+2c000] likely on CPU 6 (core 0, socket 6)
Jan 23 05:01:55 np0005593295 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 05:01:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[119111]: 23/01/2026 10:01:55 : epoch 69734690 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e080030f0 fd 39 proxy ignored for local
Jan 23 05:01:55 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:55 np0005593295 systemd[1]: Started Process Core Dump (PID 135080/UID 0).
Jan 23 05:01:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:55.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:56.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:56 np0005593295 python3.9[135233]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:01:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:57 np0005593295 systemd-coredump[135081]: Process 119239 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 43:#012#0  0x00007f6e9b73932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 23 05:01:57 np0005593295 systemd[1]: systemd-coredump@4-135080-0.service: Deactivated successfully.
Jan 23 05:01:57 np0005593295 systemd[1]: systemd-coredump@4-135080-0.service: Consumed 1.841s CPU time.
Jan 23 05:01:57 np0005593295 podman[135332]: 2026-01-23 10:01:57.434834522 +0000 UTC m=+0.034190508 container died 4f256f8471d2d67936bf5479a009e13411000f6a3de9d5c1413008f3e0bb0af9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 05:01:57 np0005593295 systemd[1]: var-lib-containers-storage-overlay-61f999682fe7ba096df068ab99db190302f37de217dbe7d7604ba685fdad3a63-merged.mount: Deactivated successfully.
Jan 23 05:01:57 np0005593295 podman[135332]: 2026-01-23 10:01:57.478211748 +0000 UTC m=+0.077567734 container remove 4f256f8471d2d67936bf5479a009e13411000f6a3de9d5c1413008f3e0bb0af9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:01:57 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 05:01:57 np0005593295 python3.9[135368]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162516.3327856-215-139474388768614/.source follow=False _original_basename=haproxy.j2 checksum=1daf285be4abb25cbd7ba376734de140aac9aefe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:01:57 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 05:01:57 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.156s CPU time.
Jan 23 05:01:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:57.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:58.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:59 np0005593295 python3.9[135577]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:01:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:01:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:59 np0005593295 python3.9[135698]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162518.6094809-260-79548490163396/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:01:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:01:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:01:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:01:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:59.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:02:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:00.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:02:00 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:01 np0005593295 python3.9[135852]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 05:02:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100201 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:02:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:02:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:01.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:02:01 np0005593295 python3.9[135936]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 05:02:02 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:02:02 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 5367 writes, 23K keys, 5367 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 5367 writes, 783 syncs, 6.85 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5367 writes, 23K keys, 5367 commit groups, 1.0 writes per commit group, ingest: 18.76 MB, 0.03 MB/s#012Interval WAL: 5367 writes, 783 syncs, 6.85 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Jan 23 05:02:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:02.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:02:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:03.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:02:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:02:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:04.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:02:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:05 np0005593295 python3.9[136093]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 05:02:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:05 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:05 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:02:05 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:02:05 np0005593295 ovn_controller[132789]: 2026-01-23T10:02:05Z|00025|memory|INFO|17280 kB peak resident set size after 29.6 seconds
Jan 23 05:02:05 np0005593295 ovn_controller[132789]: 2026-01-23T10:02:05Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Jan 23 05:02:05 np0005593295 podman[136300]: 2026-01-23 10:02:05.685091942 +0000 UTC m=+0.101186394 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 23 05:02:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:02:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:05.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:02:05 np0005593295 python3.9[136341]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:06.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:06 np0005593295 python3.9[136474]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162525.3347979-371-168894186340437/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:02:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:06 np0005593295 python3.9[136625]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:07 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:02:07 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:02:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:07 np0005593295 python3.9[136746]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162526.498535-371-18692316727513/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:02:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:02:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:07.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:02:07 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 5.
Jan 23 05:02:07 np0005593295 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:02:07 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.156s CPU time.
Jan 23 05:02:07 np0005593295 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 05:02:08 np0005593295 podman[136816]: 2026-01-23 10:02:08.014764017 +0000 UTC m=+0.043632463 container create 258224a89dca4f951aae15c4aabbdf8f7e57bade3608b553a304805da5c9f4f9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 05:02:08 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce9718135be574e551ecb587204459d33867a504fdc3c71066c66db68a17e904/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 05:02:08 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce9718135be574e551ecb587204459d33867a504fdc3c71066c66db68a17e904/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 05:02:08 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce9718135be574e551ecb587204459d33867a504fdc3c71066c66db68a17e904/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:02:08 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce9718135be574e551ecb587204459d33867a504fdc3c71066c66db68a17e904/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:02:08 np0005593295 podman[136816]: 2026-01-23 10:02:08.073788704 +0000 UTC m=+0.102657170 container init 258224a89dca4f951aae15c4aabbdf8f7e57bade3608b553a304805da5c9f4f9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 05:02:08 np0005593295 podman[136816]: 2026-01-23 10:02:08.080987914 +0000 UTC m=+0.109856350 container start 258224a89dca4f951aae15c4aabbdf8f7e57bade3608b553a304805da5c9f4f9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 05:02:08 np0005593295 bash[136816]: 258224a89dca4f951aae15c4aabbdf8f7e57bade3608b553a304805da5c9f4f9
Jan 23 05:02:08 np0005593295 podman[136816]: 2026-01-23 10:02:07.994541841 +0000 UTC m=+0.023410307 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 05:02:08 np0005593295 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:02:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:08 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 05:02:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:08 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 05:02:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:08 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 05:02:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:08 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 05:02:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:08 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 05:02:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:08 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 05:02:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:08 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 05:02:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:08 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:02:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:08.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:08 np0005593295 python3.9[136999]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:09 np0005593295 python3.9[137120]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162528.3329885-503-60628928061541/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:02:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:09.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:09 np0005593295 python3.9[137270]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:10.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:10 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:10 np0005593295 python3.9[137392]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162529.51031-503-199991504748083/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:02:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:11 np0005593295 python3.9[137543]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:02:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:02:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:11.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:02:12 np0005593295 python3.9[137697]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:02:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:12.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:12 np0005593295 python3.9[137851]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:13 np0005593295 python3.9[137929]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:02:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:13 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:02:13 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:02:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:13 np0005593295 python3.9[138106]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:02:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:13.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:02:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:14 np0005593295 python3.9[138185]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:02:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:02:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:14.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:02:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:14 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:02:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:14 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:02:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:14 np0005593295 python3.9[138338]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:02:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:15 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:15 np0005593295 python3.9[138490]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:15.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:16 np0005593295 python3.9[138568]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:02:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:16.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:16 np0005593295 python3.9[138722]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:17 np0005593295 python3.9[138800]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:02:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:17.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:18 np0005593295 python3.9[138952]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:02:18 np0005593295 systemd[1]: Reloading.
Jan 23 05:02:18 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:02:18 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:02:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:18.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:19 np0005593295 python3.9[139167]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:19 np0005593295 python3.9[139245]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:02:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:02:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:19.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:20.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:20 np0005593295 python3.9[139398]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:20 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:02:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:20 np0005593295 python3.9[139488]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:02:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:21 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd960000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:21 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9540016c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:21 np0005593295 python3.9[139640]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:02:21 np0005593295 systemd[1]: Reloading.
Jan 23 05:02:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:02:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:21.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:02:21 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:02:21 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:02:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:22 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd940000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:22 np0005593295 systemd[1]: Starting Create netns directory...
Jan 23 05:02:22 np0005593295 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 05:02:22 np0005593295 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 05:02:22 np0005593295 systemd[1]: Finished Create netns directory.
Jan 23 05:02:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:02:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:22.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:02:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:22 np0005593295 python3.9[139841]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:02:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:23 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100223 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:02:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:23 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:23 np0005593295 python3.9[139993]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:02:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:23.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:02:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:24 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:24 np0005593295 python3.9[140117]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162543.3174987-956-250022875305633/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:02:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:24.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:25 np0005593295 python3.9[140270]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:02:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:25 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:25 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:25 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9440016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:25.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:26 np0005593295 python3.9[140422]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:02:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:26 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:26.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:26 np0005593295 python3.9[140576]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:02:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:27 np0005593295 python3.9[140699]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162546.3612838-1055-748943752289/.source.json _original_basename=.eahzrann follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:02:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:27 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9400016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:27 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:27.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:28 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9440016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:28 np0005593295 python3.9[140850]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:02:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:02:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:28.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:02:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:29 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:29 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd940001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:02:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:29.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:02:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:30 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:30.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:30 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:30 np0005593295 python3.9[141275]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 23 05:02:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:31 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9440016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:31 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:31 np0005593295 python3.9[141428]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 05:02:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:02:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:31.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:02:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:32 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd940001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:32.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:32 np0005593295 python3[141582]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 05:02:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:33 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:33 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:02:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:33.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:02:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:34 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:34.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:34 np0005593295 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 23 05:02:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:35 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd940001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:35 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:35 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:35.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:36 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:36.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:37 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:37 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9400032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:02:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:37.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:02:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:38 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:38.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100238 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:02:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:39 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:39 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:39.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:40 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9400032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:02:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:40.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:02:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:41 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:41 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:41 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:02:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:41.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:02:42 np0005593295 podman[141664]: 2026-01-23 10:02:42.043903007 +0000 UTC m=+5.715004400 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:02:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:42 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:42.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:43 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd940004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:43 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:02:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:43.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:02:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:44 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:44.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:45 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:45 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd940004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:45.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:46 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:46.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:47 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:47 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:47 : epoch 69734720 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:02:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:47.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:47 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:47 np0005593295 podman[141595]: 2026-01-23 10:02:47.941644504 +0000 UTC m=+15.208525819 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 05:02:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:48 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:48 np0005593295 podman[141783]: 2026-01-23 10:02:48.096845521 +0000 UTC m=+0.053823998 container create bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:02:48 np0005593295 podman[141783]: 2026-01-23 10:02:48.070493171 +0000 UTC m=+0.027471678 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 05:02:48 np0005593295 python3[141582]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 05:02:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:02:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:48.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:02:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:48 np0005593295 python3.9[141972]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:02:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:49 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:49 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:49 np0005593295 python3.9[142126]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:02:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:02:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:49.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:02:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:50 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:50 np0005593295 python3.9[142203]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:02:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:02:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:50.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:02:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:50 : epoch 69734720 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:02:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:50 : epoch 69734720 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:02:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:51 np0005593295 python3.9[142355]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769162570.2571516-1289-62413515576395/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:02:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:51 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:51 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd940004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:51 np0005593295 python3.9[142431]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 05:02:51 np0005593295 systemd[1]: Reloading.
Jan 23 05:02:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:51 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:02:51 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:02:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:02:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:51.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:02:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:52 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:02:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:52.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:02:52 np0005593295 python3.9[142544]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:02:52 np0005593295 systemd[1]: Reloading.
Jan 23 05:02:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:52 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:02:52 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:02:52 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:53 np0005593295 systemd[1]: Starting ovn_metadata_agent container...
Jan 23 05:02:53 np0005593295 systemd[1]: Started libcrun container.
Jan 23 05:02:53 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5293bc396b427585d7816b1a4e1106196f5aa403872ea13b3594ce59e34d84cf/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 23 05:02:53 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5293bc396b427585d7816b1a4e1106196f5aa403872ea13b3594ce59e34d84cf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:02:53 np0005593295 systemd[1]: Started /usr/bin/podman healthcheck run bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3.
Jan 23 05:02:53 np0005593295 podman[142586]: 2026-01-23 10:02:53.148013865 +0000 UTC m=+0.122438221 container init bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: + sudo -E kolla_set_configs
Jan 23 05:02:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:53 np0005593295 podman[142586]: 2026-01-23 10:02:53.177632345 +0000 UTC m=+0.152056691 container start bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 23 05:02:53 np0005593295 edpm-start-podman-container[142586]: ovn_metadata_agent
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: INFO:__main__:Validating config file
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: INFO:__main__:Copying service configuration files
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: INFO:__main__:Writing out command to execute
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 23 05:02:53 np0005593295 edpm-start-podman-container[142585]: Creating additional drop-in dependency for "ovn_metadata_agent" (bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3)
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: ++ cat /run_command
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: + CMD=neutron-ovn-metadata-agent
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: + ARGS=
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: + sudo kolla_copy_cacerts
Jan 23 05:02:53 np0005593295 systemd[1]: Reloading.
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: + [[ ! -n '' ]]
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: + . kolla_extend_start
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: Running command: 'neutron-ovn-metadata-agent'
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: + umask 0022
Jan 23 05:02:53 np0005593295 ovn_metadata_agent[142601]: + exec neutron-ovn-metadata-agent
Jan 23 05:02:53 np0005593295 podman[142607]: 2026-01-23 10:02:53.276660017 +0000 UTC m=+0.086109255 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:02:53 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:02:53 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:02:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:53 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:53 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:53 np0005593295 systemd[1]: Started ovn_metadata_agent container.
Jan 23 05:02:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:53.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:53 : epoch 69734720 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:02:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:54 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:54.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:55 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.425 142606 INFO neutron.common.config [-] Logging enabled!#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.425 142606 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.425 142606 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.426 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.426 142606 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.426 142606 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.427 142606 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.427 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.427 142606 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.427 142606 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.427 142606 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.427 142606 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.427 142606 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.427 142606 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.427 142606 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.428 142606 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.428 142606 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.428 142606 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.428 142606 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.428 142606 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.428 142606 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.428 142606 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.428 142606 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.429 142606 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.429 142606 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.429 142606 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.429 142606 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.429 142606 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.429 142606 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.429 142606 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.429 142606 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.429 142606 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.430 142606 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.430 142606 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.430 142606 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.430 142606 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.430 142606 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.430 142606 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.430 142606 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.430 142606 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.431 142606 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.431 142606 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.431 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.431 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.431 142606 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.431 142606 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.431 142606 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.431 142606 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.431 142606 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.431 142606 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.432 142606 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.432 142606 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.432 142606 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.432 142606 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.432 142606 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.432 142606 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.432 142606 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.432 142606 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.432 142606 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.432 142606 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.433 142606 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.433 142606 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.433 142606 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.433 142606 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.433 142606 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.433 142606 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.433 142606 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.433 142606 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.433 142606 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.434 142606 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.434 142606 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.434 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.434 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.434 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.434 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.434 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.434 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.435 142606 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.435 142606 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.435 142606 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.435 142606 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.435 142606 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.435 142606 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.435 142606 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.435 142606 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.436 142606 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.436 142606 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.436 142606 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.436 142606 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.436 142606 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.436 142606 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.436 142606 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.436 142606 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.436 142606 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.437 142606 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.437 142606 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.437 142606 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.437 142606 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.437 142606 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.437 142606 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.437 142606 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.437 142606 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.437 142606 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.437 142606 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.438 142606 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.438 142606 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.438 142606 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.438 142606 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.438 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.438 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.438 142606 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.438 142606 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.439 142606 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.439 142606 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.439 142606 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.439 142606 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.439 142606 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.439 142606 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.439 142606 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.439 142606 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.439 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.440 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.440 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.440 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.440 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.440 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.440 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.440 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.440 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.441 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.441 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.441 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.441 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.441 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.441 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.441 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.441 142606 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.442 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.442 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.442 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.442 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.442 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.442 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.442 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.442 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.442 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.442 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.443 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.443 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.443 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.443 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.443 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.443 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.443 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.443 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.444 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.444 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.444 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.444 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.444 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.444 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.444 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.444 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.444 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.444 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.445 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.445 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.445 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.445 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.445 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.445 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.445 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.446 142606 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.446 142606 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.446 142606 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.446 142606 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.446 142606 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.446 142606 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.446 142606 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.446 142606 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.446 142606 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.447 142606 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.447 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.447 142606 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.447 142606 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.447 142606 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.447 142606 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.447 142606 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.447 142606 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.447 142606 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.448 142606 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.448 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.448 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.448 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.448 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.448 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.448 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.448 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.448 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.449 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.449 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.449 142606 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.449 142606 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.449 142606 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.449 142606 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.449 142606 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.450 142606 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.450 142606 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.450 142606 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.450 142606 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.450 142606 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.450 142606 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.450 142606 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.450 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.451 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.451 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.451 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.451 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.451 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.451 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.451 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.451 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.452 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.452 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.452 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.452 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.452 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.452 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.452 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.452 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.452 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.453 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.453 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.453 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.453 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.453 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.453 142606 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.453 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.453 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.453 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.454 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.454 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.454 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.454 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.454 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.454 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.454 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.454 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.454 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.455 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.455 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.455 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.455 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.455 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.455 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.455 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.455 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.455 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.456 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.456 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.456 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.456 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.456 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.456 142606 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.456 142606 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.456 142606 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.457 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.457 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.457 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.457 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.457 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.457 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.457 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.457 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.457 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.458 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.458 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.458 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.458 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.458 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.458 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.458 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.459 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.459 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.459 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.459 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.459 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.459 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.459 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.460 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.460 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.460 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.460 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.460 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.460 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.460 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.461 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.461 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.461 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.461 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.461 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.461 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.461 142606 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.461 142606 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 23 05:02:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:55 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.472 142606 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.472 142606 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.472 142606 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.472 142606 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.473 142606 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.489 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 8fb585ea-168c-48ac-870f-617a4fa1bbde (UUID: 8fb585ea-168c-48ac-870f-617a4fa1bbde) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.520 142606 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.521 142606 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.521 142606 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.521 142606 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.525 142606 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.531 142606 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.538 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '8fb585ea-168c-48ac-870f-617a4fa1bbde'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], external_ids={}, name=8fb585ea-168c-48ac-870f-617a4fa1bbde, nb_cfg_timestamp=1769162504105, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.539 142606 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fefcfae0f40>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.540 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.541 142606 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.541 142606 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.541 142606 INFO oslo_service.service [-] Starting 1 workers
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.546 142606 DEBUG oslo_service.service [-] Started child 142717 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.549 142717 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-506844'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.551 142606 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpp12ig6ig/privsep.sock']
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.571 142717 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.572 142717 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.572 142717 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.578 142717 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.584 142717 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 23 05:02:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:55.594 142717 INFO eventlet.wsgi.server [-] (142717) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Jan 23 05:02:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:55.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:56 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:56 np0005593295 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 23 05:02:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:56.335 142606 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 23 05:02:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:56.336 142606 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpp12ig6ig/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 23 05:02:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:56.127 142723 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 23 05:02:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:56.134 142723 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 23 05:02:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:56.137 142723 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Jan 23 05:02:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:56.137 142723 INFO oslo.privsep.daemon [-] privsep daemon running as pid 142723
Jan 23 05:02:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:56.338 142723 DEBUG oslo.privsep.daemon [-] privsep: reply[528f047c-16e8-44a5-bcf0-49e980f2fbee]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:02:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:56.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:56.924 142723 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:02:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:56.924 142723 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:02:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:56.924 142723 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:02:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:57 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:57 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.685 142723 DEBUG oslo.privsep.daemon [-] privsep: reply[1578408f-6cca-4f8d-af7a-979fe0e002ea]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.687 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, column=external_ids, values=({'neutron:ovn-metadata-id': '52d0a7f1-ccfa-5fc4-b009-582563d38ee8'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.714 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.721 142606 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.721 142606 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.721 142606 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.722 142606 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.722 142606 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.722 142606 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.722 142606 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.722 142606 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.723 142606 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.723 142606 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.723 142606 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.723 142606 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.723 142606 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.723 142606 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.724 142606 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.724 142606 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.724 142606 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.724 142606 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.724 142606 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.724 142606 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.725 142606 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.725 142606 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.725 142606 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.725 142606 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.725 142606 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.726 142606 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.726 142606 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.726 142606 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.726 142606 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.726 142606 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.726 142606 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.727 142606 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.727 142606 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.727 142606 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.727 142606 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.727 142606 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.727 142606 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.728 142606 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.728 142606 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.728 142606 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.728 142606 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.728 142606 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.728 142606 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.728 142606 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.729 142606 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.729 142606 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.729 142606 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.729 142606 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.729 142606 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.729 142606 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.729 142606 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.729 142606 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.729 142606 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.729 142606 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.730 142606 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.730 142606 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.730 142606 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.730 142606 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.730 142606 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.730 142606 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.730 142606 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.730 142606 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.730 142606 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.730 142606 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.731 142606 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.731 142606 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.731 142606 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.731 142606 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.731 142606 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.731 142606 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.731 142606 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.731 142606 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.731 142606 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.732 142606 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.732 142606 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.732 142606 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.732 142606 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.732 142606 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.732 142606 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.732 142606 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.733 142606 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.733 142606 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.733 142606 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.733 142606 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.733 142606 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.733 142606 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.733 142606 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.734 142606 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.734 142606 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.734 142606 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.734 142606 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.734 142606 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.734 142606 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.734 142606 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.735 142606 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.735 142606 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.735 142606 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.735 142606 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.735 142606 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.735 142606 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.735 142606 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.735 142606 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.736 142606 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.736 142606 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.736 142606 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.736 142606 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.736 142606 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.736 142606 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.737 142606 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.737 142606 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.737 142606 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.737 142606 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.737 142606 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.737 142606 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.738 142606 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.738 142606 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.738 142606 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.738 142606 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.738 142606 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.738 142606 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.739 142606 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.739 142606 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.739 142606 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.739 142606 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.739 142606 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.739 142606 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.740 142606 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.740 142606 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.740 142606 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.740 142606 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.740 142606 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.740 142606 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.741 142606 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.741 142606 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.741 142606 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.741 142606 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.741 142606 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.741 142606 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.742 142606 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.742 142606 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.742 142606 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.742 142606 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.742 142606 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.742 142606 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.742 142606 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.743 142606 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.743 142606 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.743 142606 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.743 142606 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.743 142606 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.743 142606 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.744 142606 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.744 142606 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.744 142606 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.744 142606 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.744 142606 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.744 142606 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.744 142606 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.745 142606 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.745 142606 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.745 142606 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.745 142606 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.745 142606 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.745 142606 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.745 142606 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.745 142606 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.746 142606 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.746 142606 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.746 142606 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.746 142606 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.746 142606 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.746 142606 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.746 142606 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.746 142606 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.747 142606 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.747 142606 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.747 142606 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.747 142606 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.747 142606 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.747 142606 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.747 142606 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.747 142606 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.748 142606 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.748 142606 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.748 142606 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.748 142606 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.748 142606 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.748 142606 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.748 142606 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.748 142606 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.749 142606 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.749 142606 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.749 142606 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.749 142606 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.749 142606 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.749 142606 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.749 142606 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.749 142606 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.749 142606 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.750 142606 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.750 142606 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.750 142606 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.750 142606 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.750 142606 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.750 142606 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.750 142606 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.750 142606 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.750 142606 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.750 142606 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.751 142606 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.751 142606 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.751 142606 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.751 142606 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.751 142606 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.751 142606 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.751 142606 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.751 142606 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.752 142606 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.752 142606 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.752 142606 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.752 142606 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.752 142606 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.752 142606 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.752 142606 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.752 142606 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.752 142606 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.753 142606 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.753 142606 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.753 142606 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.753 142606 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.753 142606 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.753 142606 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.753 142606 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.753 142606 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.754 142606 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.754 142606 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.754 142606 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.754 142606 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.754 142606 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.754 142606 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.754 142606 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.755 142606 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.755 142606 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.755 142606 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.755 142606 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.755 142606 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.755 142606 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.755 142606 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.755 142606 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.755 142606 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.755 142606 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.756 142606 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.756 142606 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.756 142606 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.756 142606 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.756 142606 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.756 142606 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.756 142606 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.756 142606 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.756 142606 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.757 142606 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.757 142606 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.757 142606 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.757 142606 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.757 142606 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.757 142606 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.757 142606 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.757 142606 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.757 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.758 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.758 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.758 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.758 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.758 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.758 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.758 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.758 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.758 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.758 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.759 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.759 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.759 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.759 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.759 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.759 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.759 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.759 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.759 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.759 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.760 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.760 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.760 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.760 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.760 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.760 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.760 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.760 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.761 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.761 142606 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.761 142606 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.761 142606 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.761 142606 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.761 142606 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:02:57 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:02:57.761 142606 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 23 05:02:57 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:02:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:57.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:02:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:58 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9640023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:58.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:59 np0005593295 python3.9[142856]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 23 05:02:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:02:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:59 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:02:59 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:02:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:02:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:02:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:02:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:59.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:00 np0005593295 python3.9[143034]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:03:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:00 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:00.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:00 np0005593295 python3.9[143160]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162579.6608417-1425-117364402017054/.source.yaml _original_basename=.vpvltry9 follow=False checksum=d88282ad6bcd11f7bd2cbc3f4703eb6122d6b05d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100301 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:03:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:01 np0005593295 systemd[1]: session-51.scope: Deactivated successfully.
Jan 23 05:03:01 np0005593295 systemd[1]: session-51.scope: Consumed 1min 2.380s CPU time.
Jan 23 05:03:01 np0005593295 systemd-logind[786]: Session 51 logged out. Waiting for processes to exit.
Jan 23 05:03:01 np0005593295 systemd-logind[786]: Removed session 51.
Jan 23 05:03:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:01 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9640023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:01 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:03:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:01.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:03:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:02 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:02.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:02 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:03 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:03 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9640030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:03.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:04 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:04.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:05 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:05 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:03:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:05.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:03:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:06 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:06.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:07 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:07 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:07 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:03:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:07.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:03:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:08 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:08.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:09 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:09 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:09.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:10 np0005593295 systemd-logind[786]: New session 52 of user zuul.
Jan 23 05:03:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:10 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:10 np0005593295 systemd[1]: Started Session 52 of User zuul.
Jan 23 05:03:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:10.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:11 np0005593295 python3.9[143348]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 05:03:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:11 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:11 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:03:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:11.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:03:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:12 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:12 np0005593295 python3.9[143505]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:03:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:12.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:12 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:13 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:13 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:13.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:13 np0005593295 python3.9[143719]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 05:03:13 np0005593295 systemd[1]: Reloading.
Jan 23 05:03:14 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:03:14 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:03:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:14 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:14.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:15 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:15 np0005593295 python3.9[143941]: ansible-ansible.builtin.service_facts Invoked
Jan 23 05:03:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:15 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:15 np0005593295 network[143958]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 05:03:15 np0005593295 network[143959]: 'network-scripts' will be removed from distribution in near future.
Jan 23 05:03:15 np0005593295 network[143960]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 05:03:15 np0005593295 podman[143965]: 2026-01-23 10:03:15.66659435 +0000 UTC m=+0.105771040 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Jan 23 05:03:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:03:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:15.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:03:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:16 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:16.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:17 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:17 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:03:17 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:03:17 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:03:17 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:03:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:17 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:17 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:17.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:18 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:18.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:19 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:19 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:19.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:20 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:20.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:21 np0005593295 python3.9[144279]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:03:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:21 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:21 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd954001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:03:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:21.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:03:22 np0005593295 python3.9[144432]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:03:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:22 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:03:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:22.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:03:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:22 np0005593295 python3.9[144587]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:03:22 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:23 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:23 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:23 np0005593295 python3.9[144740]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:03:23 np0005593295 podman[144742]: 2026-01-23 10:03:23.634369468 +0000 UTC m=+0.056545166 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:03:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:23.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:24 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd940001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:24 np0005593295 python3.9[144915]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:03:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:24.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:25 np0005593295 python3.9[145069]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:03:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:25 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:25 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:25 np0005593295 python3.9[145223]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:03:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:25.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:26 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964004b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:26.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:27 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964004b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:27 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:03:27 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:03:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:27 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:27 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:27 np0005593295 python3.9[145403]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:03:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:27.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:03:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:28 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:28 np0005593295 python3.9[145556]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:28.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:29 np0005593295 python3.9[145709]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:29 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964004b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:29 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd93c000d20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:29 np0005593295 python3.9[145861]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:03:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:29.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:03:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:30 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:30.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:30 np0005593295 python3.9[146014]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:31 np0005593295 python3.9[146167]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:31 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:31 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964004b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:03:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:31.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:03:31 np0005593295 python3.9[146319]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:32 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd93c001840 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:32.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:32 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:32 np0005593295 python3.9[146473]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:33 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd93c001840 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:33 np0005593295 python3.9[146625]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:33 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964004b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:33.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:34 np0005593295 python3.9[146778]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:34 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:03:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:34.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:03:34 np0005593295 python3.9[146931]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:35 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:35 np0005593295 python3.9[147083]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:35 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:03:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:35.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:03:36 np0005593295 python3.9[147235]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:36 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964004b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:03:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:36.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:03:36 np0005593295 python3.9[147389]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:03:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:37 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:37 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:37 np0005593295 python3.9[147541]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:03:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:37 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:37.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:38 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd93c001840 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:38.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:38 np0005593295 python3.9[147695]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 05:03:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:39 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd93c001840 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:39 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:39 np0005593295 python3.9[147872]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 05:03:39 np0005593295 systemd[1]: Reloading.
Jan 23 05:03:39 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:03:39 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:03:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:39.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:40 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.002000048s ======
Jan 23 05:03:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:40.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Jan 23 05:03:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:40 np0005593295 python3.9[148060]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:03:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:41 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd93c001840 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:41 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd93c001840 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:41 np0005593295 python3.9[148213]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:03:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:03:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:41.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:03:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:42 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9480040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:42 np0005593295 python3.9[148369]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:03:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:03:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:42.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:03:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:42 np0005593295 python3.9[148523]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:03:42 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:43 np0005593295 python3.9[148676]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:03:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:43 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:43 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd940001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:03:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:43.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:03:44 np0005593295 python3.9[148829]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:03:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:44 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964004b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:44.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:44 np0005593295 python3.9[148984]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:03:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:45 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9480040d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:45 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003dd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:03:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:45.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:03:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:46 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd940001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:03:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:46.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:03:46 np0005593295 podman[149012]: 2026-01-23 10:03:46.68523611 +0000 UTC m=+0.104280953 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 05:03:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:47 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964004b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:47 np0005593295 python3.9[149165]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 23 05:03:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:47 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd9480040f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:47 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:03:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:47.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:03:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:48 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd944003df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:48 np0005593295 python3.9[149319]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 05:03:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:03:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:48.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:03:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:49 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd940001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:49 np0005593295 python3.9[149478]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 05:03:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:49 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd964004b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:03:49 np0005593295 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:03:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:03:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:49.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:03:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[136831]: 23/01/2026 10:03:50 : epoch 69734720 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd948004110 fd 39 proxy ignored for local
Jan 23 05:03:50 np0005593295 kernel: ganesha.nfsd[139642]: segfault at 50 ip 00007fd9e9f0e32e sp 00007fd9537fd210 error 4 in libntirpc.so.5.8[7fd9e9ef3000+2c000] likely on CPU 5 (core 0, socket 5)
Jan 23 05:03:50 np0005593295 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 05:03:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:50 np0005593295 systemd[1]: Started Process Core Dump (PID 149536/UID 0).
Jan 23 05:03:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:03:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:50.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:03:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:51 np0005593295 python3.9[149643]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 05:03:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:51.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:52.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:52 np0005593295 python3.9[149728]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 05:03:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:52 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:03:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:53.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:03:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:54 np0005593295 systemd-coredump[149544]: Process 136835 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 52:#012#0  0x00007fd9e9f0e32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012#1  0x0000000000000000 n/a (n/a + 0x0)#012#2  0x00007fd9e9f18900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)#012ELF object binary architecture: AMD x86-64
Jan 23 05:03:54 np0005593295 systemd[1]: systemd-coredump@5-149536-0.service: Deactivated successfully.
Jan 23 05:03:54 np0005593295 systemd[1]: systemd-coredump@5-149536-0.service: Consumed 4.061s CPU time.
Jan 23 05:03:54 np0005593295 podman[149738]: 2026-01-23 10:03:54.407640161 +0000 UTC m=+0.026756057 container died 258224a89dca4f951aae15c4aabbdf8f7e57bade3608b553a304805da5c9f4f9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True)
Jan 23 05:03:54 np0005593295 systemd[1]: var-lib-containers-storage-overlay-ce9718135be574e551ecb587204459d33867a504fdc3c71066c66db68a17e904-merged.mount: Deactivated successfully.
Jan 23 05:03:54 np0005593295 podman[149738]: 2026-01-23 10:03:54.450211927 +0000 UTC m=+0.069327803 container remove 258224a89dca4f951aae15c4aabbdf8f7e57bade3608b553a304805da5c9f4f9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Jan 23 05:03:54 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 05:03:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100354 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:03:54 np0005593295 podman[149737]: 2026-01-23 10:03:54.484062507 +0000 UTC m=+0.090233146 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 23 05:03:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:54.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:54 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 05:03:54 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.207s CPU time.
Jan 23 05:03:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:03:55.464 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:03:55.466 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:03:55.466 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:55.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:56.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:57 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:57.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:03:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:58.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:03:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:03:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100359 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:03:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:03:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:03:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:03:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:59.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:04:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:00.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:04:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:01.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:02.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:02 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:04:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:03.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:04:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:04.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:04 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 6.
Jan 23 05:04:04 np0005593295 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:04:04 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.207s CPU time.
Jan 23 05:04:04 np0005593295 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 05:04:05 np0005593295 podman[150059]: 2026-01-23 10:04:05.051139481 +0000 UTC m=+0.050533482 container create cc3ecaa5c9bd162608adda7e4b6d8c9bc9b9da535becdeae815c6ec28c0a8abf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, io.buildah.version=1.40.1, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:04:05 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a82868f9daeb7469a466e90c095fd894219075662e8ea6a1eab803fdf69c178b/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 05:04:05 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a82868f9daeb7469a466e90c095fd894219075662e8ea6a1eab803fdf69c178b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 05:04:05 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a82868f9daeb7469a466e90c095fd894219075662e8ea6a1eab803fdf69c178b/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:04:05 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a82868f9daeb7469a466e90c095fd894219075662e8ea6a1eab803fdf69c178b/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:04:05 np0005593295 podman[150059]: 2026-01-23 10:04:05.106281094 +0000 UTC m=+0.105675125 container init cc3ecaa5c9bd162608adda7e4b6d8c9bc9b9da535becdeae815c6ec28c0a8abf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:04:05 np0005593295 podman[150059]: 2026-01-23 10:04:05.112817274 +0000 UTC m=+0.112211275 container start cc3ecaa5c9bd162608adda7e4b6d8c9bc9b9da535becdeae815c6ec28c0a8abf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 05:04:05 np0005593295 bash[150059]: cc3ecaa5c9bd162608adda7e4b6d8c9bc9b9da535becdeae815c6ec28c0a8abf
Jan 23 05:04:05 np0005593295 podman[150059]: 2026-01-23 10:04:05.031666453 +0000 UTC m=+0.031060454 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 05:04:05 np0005593295 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:04:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 05:04:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 05:04:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 05:04:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 05:04:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 05:04:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 05:04:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 05:04:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:04:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:04:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:05.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:04:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:04:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:06.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:04:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:07 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:07.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:04:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:08.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:04:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:09.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:04:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:10.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:04:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:11 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Jan 23 05:04:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:11 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Jan 23 05:04:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:11 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:04:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:11 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:04:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:11 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:04:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:04:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:11.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:04:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:12 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:04:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:12 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:04:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:12 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:04:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:12 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:04:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:12 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:04:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:12 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:04:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:12 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:04:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:12.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:12 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:04:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:13.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:04:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100414 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:04:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:14.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:04:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:15.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:04:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:04:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:16.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:04:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:17 np0005593295 podman[150134]: 2026-01-23 10:04:17.689622376 +0000 UTC m=+0.107550681 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 05:04:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:17 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:17.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000013:nfs.cephfs.1: -2
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:04:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:04:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:18.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:04:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:19 np0005593295 kernel: SELinux:  Converting 2780 SID table entries...
Jan 23 05:04:19 np0005593295 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 05:04:19 np0005593295 kernel: SELinux:  policy capability open_perms=1
Jan 23 05:04:19 np0005593295 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 05:04:19 np0005593295 kernel: SELinux:  policy capability always_check_network=0
Jan 23 05:04:19 np0005593295 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 05:04:19 np0005593295 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 05:04:19 np0005593295 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 05:04:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:19 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:19 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d00016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:19.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:20 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:20.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:21 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100421 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:04:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:21 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:21.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:22 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:22.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:22 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:23 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:23 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:23.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:24 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:24 np0005593295 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Jan 23 05:04:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:04:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:24.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:04:24 np0005593295 podman[150216]: 2026-01-23 10:04:24.657749014 +0000 UTC m=+0.064322259 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Jan 23 05:04:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:25 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:25 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:25.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:26 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:04:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:26.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:04:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:27 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:27 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:27 np0005593295 podman[150359]: 2026-01-23 10:04:27.608657903 +0000 UTC m=+0.095727570 container exec 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:04:27 np0005593295 podman[150359]: 2026-01-23 10:04:27.726074645 +0000 UTC m=+0.213144282 container exec_died 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid)
Jan 23 05:04:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:27 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:27.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:28 np0005593295 podman[150457]: 2026-01-23 10:04:28.113314949 +0000 UTC m=+0.058977068 container exec 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 05:04:28 np0005593295 podman[150457]: 2026-01-23 10:04:28.13125299 +0000 UTC m=+0.076915099 container exec_died 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 05:04:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:28 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:28 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 05:04:28 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 05:04:28 np0005593295 podman[150561]: 2026-01-23 10:04:28.56394258 +0000 UTC m=+0.085919250 container exec cc3ecaa5c9bd162608adda7e4b6d8c9bc9b9da535becdeae815c6ec28c0a8abf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:04:28 np0005593295 podman[150561]: 2026-01-23 10:04:28.599131523 +0000 UTC m=+0.121108153 container exec_died cc3ecaa5c9bd162608adda7e4b6d8c9bc9b9da535becdeae815c6ec28c0a8abf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Jan 23 05:04:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:28.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:28 np0005593295 podman[150627]: 2026-01-23 10:04:28.803456559 +0000 UTC m=+0.052756706 container exec c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 05:04:28 np0005593295 podman[150627]: 2026-01-23 10:04:28.816213222 +0000 UTC m=+0.065513359 container exec_died c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 05:04:29 np0005593295 podman[150694]: 2026-01-23 10:04:29.118306467 +0000 UTC m=+0.141699589 container exec 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, architecture=x86_64, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, version=2.2.4, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.openshift.tags=Ceph keepalived)
Jan 23 05:04:29 np0005593295 podman[150694]: 2026-01-23 10:04:29.134375131 +0000 UTC m=+0.157768233 container exec_died 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, io.openshift.expose-services=, release=1793, version=2.2.4, distribution-scope=public, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, vcs-type=git, name=keepalived, architecture=x86_64)
Jan 23 05:04:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:29 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:29 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:29 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:04:29 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:04:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:29.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:30 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:30 np0005593295 kernel: SELinux:  Converting 2780 SID table entries...
Jan 23 05:04:30 np0005593295 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 05:04:30 np0005593295 kernel: SELinux:  policy capability open_perms=1
Jan 23 05:04:30 np0005593295 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 05:04:30 np0005593295 kernel: SELinux:  policy capability always_check_network=0
Jan 23 05:04:30 np0005593295 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 05:04:30 np0005593295 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 05:04:30 np0005593295 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 05:04:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:04:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:30.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:04:30 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 05:04:30 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:04:30 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:04:30 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:04:30 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:04:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:31 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:31 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:31.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:32 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:32.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:32 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:33 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:33 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:33.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:34 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:34.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:35 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:35 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:35 np0005593295 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 23 05:04:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:35.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:36 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:04:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:36.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:04:36 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:04:36 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:04:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:37 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:37 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:37 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:04:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:37.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:04:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:38 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:38.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:39 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:39 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:39.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:40 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:40.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:41 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:41 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:41.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:42 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:04:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:42.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:04:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:42 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:43 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:43 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:04:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:43.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:04:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:44 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:04:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:44.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:04:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:45 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:45 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:45.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:46 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:04:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:46.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:04:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:47 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:47 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:47 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:04:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:47.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:04:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:48 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:04:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:48.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:04:48 np0005593295 podman[153723]: 2026-01-23 10:04:48.699057055 +0000 UTC m=+0.099987875 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:04:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:49 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:49 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:49.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:50 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:50.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:51 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:51 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:04:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:51.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:04:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:52 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:52.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:52 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:53 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:53 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:04:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:53.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:04:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:54 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:54.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:04:55.466 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:04:55.467 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:04:55.467 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:55 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:55 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:55 np0005593295 podman[158500]: 2026-01-23 10:04:55.629747235 +0000 UTC m=+0.048917022 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 23 05:04:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:55.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:56 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:56.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:57 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:57 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:57 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:57.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:58 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:04:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:58.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:04:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:04:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:59 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:04:59 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:04:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:04:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:04:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:04:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:59.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:00 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:00.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:01 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:01 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:01.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:02 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:02.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:02 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:03 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:03 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:05:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:03.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:05:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:04 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:04.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:05:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:05.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:05:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:06 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:05:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:06.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:05:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:07 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:07 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:07 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:05:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:07.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:05:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:08 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:05:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:08.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:05:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:09 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:09 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:05:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:09.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:05:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:10 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:10.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:11 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:11 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:11.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:12 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:05:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:12.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:05:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:12 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:13 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:13 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:05:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:13.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:05:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:14 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:05:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:14.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:05:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:15 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:15 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:15.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:16 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:16.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:17 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:17 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:17 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:18.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100518 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:05:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:18.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:19 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:19 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:19 np0005593295 podman[167907]: 2026-01-23 10:05:19.700797076 +0000 UTC m=+0.115047974 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:05:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:20.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:20 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:05:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:20.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:05:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:21 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:21 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:22.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:22 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:22.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:23 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:23 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:23 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:23 np0005593295 kernel: SELinux:  Converting 2781 SID table entries...
Jan 23 05:05:23 np0005593295 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 05:05:23 np0005593295 kernel: SELinux:  policy capability open_perms=1
Jan 23 05:05:23 np0005593295 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 05:05:23 np0005593295 kernel: SELinux:  policy capability always_check_network=0
Jan 23 05:05:23 np0005593295 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 05:05:23 np0005593295 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 05:05:23 np0005593295 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 05:05:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:24.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:24 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:24.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:25 np0005593295 dbus-broker-launch[744]: Noticed file-system modification, trigger reload.
Jan 23 05:05:25 np0005593295 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 23 05:05:25 np0005593295 dbus-broker-launch[744]: Noticed file-system modification, trigger reload.
Jan 23 05:05:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:25 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:25 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:05:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:26.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:05:26 np0005593295 podman[167984]: 2026-01-23 10:05:26.208236398 +0000 UTC m=+0.081703817 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 05:05:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:26 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:26.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:26 : epoch 69734795 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:05:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:27 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:27 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:28.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:28 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:28 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:28.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:29 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:29 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:29 : epoch 69734795 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:05:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:29 : epoch 69734795 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:05:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:30.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:30 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:05:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:30.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:05:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:31 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:31 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:32.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:32 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:32.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:32 np0005593295 systemd[1]: Stopping OpenSSH server daemon...
Jan 23 05:05:32 np0005593295 systemd[1]: sshd.service: Deactivated successfully.
Jan 23 05:05:32 np0005593295 systemd[1]: Stopped OpenSSH server daemon.
Jan 23 05:05:32 np0005593295 systemd[1]: sshd.service: Consumed 2.134s CPU time, read 32.0K from disk, written 0B to disk.
Jan 23 05:05:32 np0005593295 systemd[1]: Stopped target sshd-keygen.target.
Jan 23 05:05:32 np0005593295 systemd[1]: Stopping sshd-keygen.target...
Jan 23 05:05:32 np0005593295 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 05:05:32 np0005593295 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 05:05:32 np0005593295 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 05:05:32 np0005593295 systemd[1]: Reached target sshd-keygen.target.
Jan 23 05:05:32 np0005593295 systemd[1]: Starting OpenSSH server daemon...
Jan 23 05:05:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:32 np0005593295 systemd[1]: Started OpenSSH server daemon.
Jan 23 05:05:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:33 : epoch 69734795 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:05:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:33 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:33 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:33 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:34.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:34 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:34 np0005593295 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 05:05:34 np0005593295 systemd[1]: Starting man-db-cache-update.service...
Jan 23 05:05:34 np0005593295 systemd[1]: Reloading.
Jan 23 05:05:34 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:05:34 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:05:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:05:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:34.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:05:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:34 np0005593295 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 05:05:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:35 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c4004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:35 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:36.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:36 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:36.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:37 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:37 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:38.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:38 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:38 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:05:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:38.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:05:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:39 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:39 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:40.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:40 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100540 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:05:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:40.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:41 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:05:41 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:05:41 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:05:41 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:05:41 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:05:41 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:05:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:41 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:41 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:42.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:42 np0005593295 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 05:05:42 np0005593295 systemd[1]: Finished man-db-cache-update.service.
Jan 23 05:05:42 np0005593295 systemd[1]: man-db-cache-update.service: Consumed 9.706s CPU time.
Jan 23 05:05:42 np0005593295 systemd[1]: run-r619068a26e7d43b6b7f70f53699058ee.service: Deactivated successfully.
Jan 23 05:05:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:42 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:42.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:43 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:43 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003ca0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:43 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:44.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:44 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:05:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:44.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:05:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:45 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:45 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:46.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:46 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:46.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:46 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:05:46 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:05:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:47 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:47 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:05:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:48.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:05:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:48 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003ce0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:48 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:05:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:48.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:05:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:49 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:49 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:50.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:50 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc001f90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:50 np0005593295 podman[177699]: 2026-01-23 10:05:50.698615748 +0000 UTC m=+0.126256730 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:05:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:50.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:51 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:51 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:05:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:52.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:05:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:52 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:52.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:52 np0005593295 python3.9[177855]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 05:05:52 np0005593295 systemd[1]: Reloading.
Jan 23 05:05:53 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:05:53 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:05:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:53 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:53 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc001f90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:53 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:54.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:54 np0005593295 python3.9[178044]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 05:05:54 np0005593295 systemd[1]: Reloading.
Jan 23 05:05:54 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:05:54 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:05:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:54 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:05:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:54.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:05:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:55 np0005593295 python3.9[178236]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 05:05:55 np0005593295 systemd[1]: Reloading.
Jan 23 05:05:55 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:05:55 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:05:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:05:55.467 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:05:55.469 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:05:55.469 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:55 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:55 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc001f90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:56.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:56 np0005593295 python3.9[178426]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 05:05:56 np0005593295 systemd[1]: Reloading.
Jan 23 05:05:56 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:05:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:56 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:05:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:56 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003d20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:56 np0005593295 podman[178467]: 2026-01-23 10:05:56.511607074 +0000 UTC m=+0.052308465 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 05:05:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:56.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:57 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003d20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:57 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003d20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:57 np0005593295 python3.9[178638]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:05:57 np0005593295 systemd[1]: Reloading.
Jan 23 05:05:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:57 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:05:57 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:05:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:58.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:58 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:58 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:58 np0005593295 python3.9[178832]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:05:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:05:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:58.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:58 np0005593295 systemd[1]: Reloading.
Jan 23 05:05:58 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:05:58 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:05:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:05:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:59 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:05:59 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003d20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:05:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:05:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:05:59 np0005593295 python3.9[179021]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:05:59 np0005593295 systemd[1]: Reloading.
Jan 23 05:06:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:06:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:00.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:06:00 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:06:00 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:06:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:00 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003d20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:06:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:00.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:06:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:00 np0005593295 python3.9[179238]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:01 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0001d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:01 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:01 np0005593295 python3.9[179393]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:01 np0005593295 systemd[1]: Reloading.
Jan 23 05:06:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:01 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:06:01 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:06:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:02.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:02 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:02.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100603 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:06:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:03 np0005593295 python3.9[179585]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 05:06:03 np0005593295 systemd[1]: Reloading.
Jan 23 05:06:03 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:03 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:06:03 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:06:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:03 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:03 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0001d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:03 np0005593295 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 23 05:06:03 np0005593295 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 23 05:06:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:06:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:04.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:06:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:04 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003d20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:04 np0005593295 python3.9[179780]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:06:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:04.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:06:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:05 np0005593295 python3.9[179935]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:05 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003cf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:06.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:06 np0005593295 python3.9[180090]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:06 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0001d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:06.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:07 np0005593295 python3.9[180247]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:07 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:07 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:07 np0005593295 python3.9[180402]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:06:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:08.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:06:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:08 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:08 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:08 np0005593295 python3.9[180559]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:06:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:08.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:06:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:09 np0005593295 python3.9[180714]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:09 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0001d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:09 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:06:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:10.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:06:10 np0005593295 python3.9[180869]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:10 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:06:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:10.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:06:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:10 np0005593295 python3.9[181026]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:11 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003d30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:11 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:11 np0005593295 python3.9[181181]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:11 : epoch 69734795 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:06:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:06:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:12.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:06:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:12 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0003ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:12 np0005593295 python3.9[181337]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:06:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:12.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:06:13 np0005593295 python3.9[181493]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:13 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:13 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:13 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:13 np0005593295 python3.9[181648]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:06:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:14.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:06:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:14 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:14 np0005593295 python3.9[181805]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 05:06:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:06:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:14.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:06:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:14 : epoch 69734795 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:06:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:14 : epoch 69734795 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:06:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:15 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:15 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:16.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:16 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:16.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:17 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:17 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:17 : epoch 69734795 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:06:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:18.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:18 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:18 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:18.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:18 np0005593295 python3.9[181964]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:06:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:19 np0005593295 python3.9[182116]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:06:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:19 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:19 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:20 np0005593295 python3.9[182288]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:06:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:20.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:20 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:20 np0005593295 python3.9[182447]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:06:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:06:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:20.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:06:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:21 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:21 np0005593295 podman[182571]: 2026-01-23 10:06:21.609304246 +0000 UTC m=+0.405412012 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller)
Jan 23 05:06:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:21 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:21 np0005593295 python3.9[182620]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:06:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:22.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:22 np0005593295 python3.9[182778]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:06:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:22 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:06:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:22.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:06:23 np0005593295 python3.9[182929]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 05:06:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100623 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:06:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:23 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:23 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:23 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:24 np0005593295 python3.9[183081]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:24.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:24 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:24 np0005593295 python3.9[183208]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162783.418961-1644-38909013991840/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:06:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:24.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:06:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:25 np0005593295 python3.9[183360]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:25 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:25 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:25 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Jan 23 05:06:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:25.892141) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:06:25 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Jan 23 05:06:25 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162785892360, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 4241, "num_deletes": 502, "total_data_size": 11742776, "memory_usage": 11936904, "flush_reason": "Manual Compaction"}
Jan 23 05:06:25 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Jan 23 05:06:25 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162785962585, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 4407138, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13108, "largest_seqno": 17344, "table_properties": {"data_size": 4395820, "index_size": 6404, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3845, "raw_key_size": 30531, "raw_average_key_size": 19, "raw_value_size": 4368991, "raw_average_value_size": 2857, "num_data_blocks": 279, "num_entries": 1529, "num_filter_entries": 1529, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162371, "oldest_key_time": 1769162371, "file_creation_time": 1769162785, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:06:25 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 70464 microseconds, and 14805 cpu microseconds.
Jan 23 05:06:25 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:06:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:25.962670) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 4407138 bytes OK
Jan 23 05:06:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:25.962705) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Jan 23 05:06:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:25.978398) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Jan 23 05:06:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:25.978466) EVENT_LOG_v1 {"time_micros": 1769162785978455, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:06:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:25.978501) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:06:25 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 11724047, prev total WAL file size 11724047, number of live WAL files 2.
Jan 23 05:06:25 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:06:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:25.981241) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323532' seq:72057594037927935, type:22 .. '6D67727374617400353034' seq:0, type:0; will stop at (end)
Jan 23 05:06:25 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:06:25 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(4303KB)], [27(12MB)]
Jan 23 05:06:25 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162785981425, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 17630701, "oldest_snapshot_seqno": -1}
Jan 23 05:06:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:26.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:26 np0005593295 python3.9[183486]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162784.9020076-1644-213166137515331/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:26 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 4980 keys, 13174161 bytes, temperature: kUnknown
Jan 23 05:06:26 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162786148686, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 13174161, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13139094, "index_size": 21517, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12485, "raw_key_size": 124859, "raw_average_key_size": 25, "raw_value_size": 13046841, "raw_average_value_size": 2619, "num_data_blocks": 899, "num_entries": 4980, "num_filter_entries": 4980, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769162785, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:06:26 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:06:26 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:26.149125) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 13174161 bytes
Jan 23 05:06:26 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:26.162842) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 105.3 rd, 78.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.2, 12.6 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(7.0) write-amplify(3.0) OK, records in: 5808, records dropped: 828 output_compression: NoCompression
Jan 23 05:06:26 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:26.162892) EVENT_LOG_v1 {"time_micros": 1769162786162873, "job": 14, "event": "compaction_finished", "compaction_time_micros": 167403, "compaction_time_cpu_micros": 43582, "output_level": 6, "num_output_files": 1, "total_output_size": 13174161, "num_input_records": 5808, "num_output_records": 4980, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:06:26 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:06:26 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162786163716, "job": 14, "event": "table_file_deletion", "file_number": 29}
Jan 23 05:06:26 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:06:26 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162786166084, "job": 14, "event": "table_file_deletion", "file_number": 27}
Jan 23 05:06:26 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:25.981058) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:26 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:26.166120) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:26 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:26.166124) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:26 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:26.166125) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:26 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:26.166126) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:26 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:06:26.166128) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:26 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c0004080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:26 np0005593295 podman[183639]: 2026-01-23 10:06:26.631627408 +0000 UTC m=+0.052609967 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:06:26 np0005593295 python3.9[183640]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:26.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:27 np0005593295 python3.9[183784]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162786.2812843-1644-28256511466840/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:27 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:27 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:27 np0005593295 python3.9[183936]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:06:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:28.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:06:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:28 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:28 np0005593295 python3.9[184062]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162787.4281337-1644-80179343992359/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:28 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:28.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:28 np0005593295 python3.9[184215]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:29 np0005593295 python3.9[184340]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162788.5212467-1644-18048058307373/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:29 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:29 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:30.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:30 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:30 np0005593295 python3.9[184497]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:30.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:31 np0005593295 python3.9[184622]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162789.8751047-1644-275273793834793/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:31 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:31 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:31 np0005593295 python3.9[184774]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:32.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:32 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8004340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:32 np0005593295 python3.9[184898]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162791.4486651-1644-105620540118285/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:06:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:32.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:06:33 np0005593295 python3.9[185051]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:33 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:33 np0005593295 python3.9[185176]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162792.5588086-1644-143105795448352/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:33 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:33 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:34.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:34 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:06:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:34.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:06:35 np0005593295 python3.9[185330]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 23 05:06:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:35 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:35 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8004340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:36 np0005593295 python3.9[185483]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:36.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:36 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:36 np0005593295 python3.9[185637]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:36.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:37 np0005593295 python3.9[185789]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:37 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:37 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:37 np0005593295 python3.9[185941]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:38.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:38 np0005593295 python3.9[186094]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:38 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8004340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:38 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:38.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:38 np0005593295 python3.9[186247]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:39 np0005593295 python3.9[186399]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:39 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:39 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:40 np0005593295 python3.9[186551]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:40.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:40 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8001f70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:40 np0005593295 python3.9[186730]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:06:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:40.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:06:41 np0005593295 python3.9[186882]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:41 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8004340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:41 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8004340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:41 np0005593295 python3.9[187034]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:42.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:42 np0005593295 python3.9[187187]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:42 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:06:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:42.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:06:42 np0005593295 python3.9[187340]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:43 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:43 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0c8001f70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:06:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[150074]: 23/01/2026 10:06:43 : epoch 69734795 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e00095a0 fd 38 proxy ignored for local
Jan 23 05:06:43 np0005593295 kernel: ganesha.nfsd[167951]: segfault at 50 ip 00007fd168b5932e sp 00007fd0e57f9210 error 4 in libntirpc.so.5.8[7fd168b3e000+2c000] likely on CPU 5 (core 0, socket 5)
Jan 23 05:06:43 np0005593295 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 05:06:43 np0005593295 systemd[1]: Started Process Core Dump (PID 187493/UID 0).
Jan 23 05:06:43 np0005593295 python3.9[187492]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:44.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:44.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:45 np0005593295 systemd-coredump[187494]: Process 150078 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 59:#012#0  0x00007fd168b5932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 23 05:06:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:45 np0005593295 systemd[1]: systemd-coredump@6-187493-0.service: Deactivated successfully.
Jan 23 05:06:45 np0005593295 systemd[1]: systemd-coredump@6-187493-0.service: Consumed 1.518s CPU time.
Jan 23 05:06:45 np0005593295 podman[187525]: 2026-01-23 10:06:45.295332151 +0000 UTC m=+0.026159171 container died cc3ecaa5c9bd162608adda7e4b6d8c9bc9b9da535becdeae815c6ec28c0a8abf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 23 05:06:45 np0005593295 systemd[1]: var-lib-containers-storage-overlay-a82868f9daeb7469a466e90c095fd894219075662e8ea6a1eab803fdf69c178b-merged.mount: Deactivated successfully.
Jan 23 05:06:45 np0005593295 podman[187525]: 2026-01-23 10:06:45.3360182 +0000 UTC m=+0.066845200 container remove cc3ecaa5c9bd162608adda7e4b6d8c9bc9b9da535becdeae815c6ec28c0a8abf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 05:06:45 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 05:06:45 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 05:06:45 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.052s CPU time.
Jan 23 05:06:45 np0005593295 python3.9[187695]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:46.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:46 np0005593295 python3.9[187819]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162805.30712-2307-86417166662119/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:46 np0005593295 python3.9[188051]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:46.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:47 np0005593295 python3.9[188174]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162806.4120848-2307-10540368233366/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:48.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:48 np0005593295 python3.9[188326]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:48 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:48 np0005593295 python3.9[188451]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162807.6564941-2307-131628813116022/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:06:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:48.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:06:49 np0005593295 python3.9[188603]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100649 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:06:49 np0005593295 python3.9[188726]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162808.7409387-2307-120372273617361/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:50.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:50 np0005593295 python3.9[188879]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:50 np0005593295 python3.9[189003]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162809.8273191-2307-88197046777382/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:50 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:06:50 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:06:50 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:06:50 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:06:50 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:06:50 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:06:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:06:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:50.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:06:51 np0005593295 python3.9[189155]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:51 np0005593295 python3.9[189278]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162810.8301108-2307-256139144719868/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:06:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:52.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:06:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:52 np0005593295 podman[189403]: 2026-01-23 10:06:52.329412572 +0000 UTC m=+0.083780689 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 05:06:52 np0005593295 python3.9[189450]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100652 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:06:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:06:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:52.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:06:52 np0005593295 python3.9[189581]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162812.0001886-2307-275382021497293/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:53 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:53 np0005593295 python3.9[189733]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:54.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:54 np0005593295 python3.9[189857]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162813.1640015-2307-82976802028080/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:54 np0005593295 python3.9[190010]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:06:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:54.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:06:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:55 np0005593295 python3.9[190133]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162814.2811522-2307-268013191509008/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:06:55.468 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:06:55.469 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:06:55.469 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:55 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 7.
Jan 23 05:06:55 np0005593295 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:06:55 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 2.052s CPU time.
Jan 23 05:06:55 np0005593295 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 05:06:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:55 np0005593295 podman[190331]: 2026-01-23 10:06:55.815786652 +0000 UTC m=+0.027075193 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 05:06:55 np0005593295 podman[190331]: 2026-01-23 10:06:55.932811436 +0000 UTC m=+0.144099967 container create cc70552ffcc96627532b5a08d41512b300ebec5bdbe07c25e585b097491a9291 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 05:06:55 np0005593295 python3.9[190326]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:56 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f9d26f84440f7e3bde72f9674f205f83ca47255b101bf5827a1c2646cf3b58f/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 05:06:56 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f9d26f84440f7e3bde72f9674f205f83ca47255b101bf5827a1c2646cf3b58f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 05:06:56 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f9d26f84440f7e3bde72f9674f205f83ca47255b101bf5827a1c2646cf3b58f/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:06:56 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f9d26f84440f7e3bde72f9674f205f83ca47255b101bf5827a1c2646cf3b58f/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:06:56 np0005593295 podman[190331]: 2026-01-23 10:06:56.101867321 +0000 UTC m=+0.313155852 container init cc70552ffcc96627532b5a08d41512b300ebec5bdbe07c25e585b097491a9291 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Jan 23 05:06:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:56.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:56 np0005593295 podman[190331]: 2026-01-23 10:06:56.109165292 +0000 UTC m=+0.320453813 container start cc70552ffcc96627532b5a08d41512b300ebec5bdbe07c25e585b097491a9291 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 05:06:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:06:56 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 05:06:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:06:56 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 05:06:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:06:56 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 05:06:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:06:56 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 05:06:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:06:56 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 05:06:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:06:56 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 05:06:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:06:56 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 05:06:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:06:56 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:06:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:56 np0005593295 bash[190331]: cc70552ffcc96627532b5a08d41512b300ebec5bdbe07c25e585b097491a9291
Jan 23 05:06:56 np0005593295 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:06:56 np0005593295 python3.9[190537]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162815.4426026-2307-106569025354388/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:56.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:57 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:06:57 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:06:57 np0005593295 podman[190661]: 2026-01-23 10:06:57.122577348 +0000 UTC m=+0.053253402 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 23 05:06:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:57 np0005593295 python3.9[190708]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:58 np0005593295 python3.9[190831]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162816.8482833-2307-215245772927508/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:58.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:58 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:58 np0005593295 python3.9[190985]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:06:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:06:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:58.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:06:59 np0005593295 python3.9[191108]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162818.2063055-2307-33683673485829/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:06:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:06:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:06:59 np0005593295 python3.9[191260]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:06:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:06:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:00.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:00 np0005593295 python3.9[191407]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162819.2830079-2307-86897461180381/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:00.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:01 np0005593295 python3.9[191562]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:01 np0005593295 python3.9[191685]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162820.340449-2307-51118746440456/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:02.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:02 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:07:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:02 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:07:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:07:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:02.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:07:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:03 np0005593295 python3.9[191837]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:07:03 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:04.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:04 np0005593295 python3.9[191993]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 23 05:07:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:04.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:05 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 05:07:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:05 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:07:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:05 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:07:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:05 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:07:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:06 np0005593295 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 23 05:07:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:06.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:06 np0005593295 python3.9[192151]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:06 np0005593295 python3.9[192304]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:06.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:07 np0005593295 python3.9[192456]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:08 np0005593295 python3.9[192609]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:08.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:08 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a20000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:08 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:08 np0005593295 python3.9[192777]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:08.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:09 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a20000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:09 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a14000fb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:10.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:10 np0005593295 python3.9[192930]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:10 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18001550 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:10 np0005593295 python3.9[193083]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:10.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:11 np0005593295 python3.9[193235]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100711 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:07:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:11 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a00001140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:11 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a20001dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:11 np0005593295 python3.9[193387]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:12.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:12 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a20001dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:12 np0005593295 python3.9[193541]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.834952) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832835343, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 710, "num_deletes": 251, "total_data_size": 1525849, "memory_usage": 1546608, "flush_reason": "Manual Compaction"}
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832847487, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 986082, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17349, "largest_seqno": 18054, "table_properties": {"data_size": 982539, "index_size": 1387, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 7958, "raw_average_key_size": 19, "raw_value_size": 975535, "raw_average_value_size": 2373, "num_data_blocks": 61, "num_entries": 411, "num_filter_entries": 411, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162786, "oldest_key_time": 1769162786, "file_creation_time": 1769162832, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 12529 microseconds, and 8171 cpu microseconds.
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.847598) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 986082 bytes OK
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.847649) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.851185) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.851220) EVENT_LOG_v1 {"time_micros": 1769162832851209, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.851246) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 1522032, prev total WAL file size 1522032, number of live WAL files 2.
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.852140) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(962KB)], [30(12MB)]
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832852304, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 14160243, "oldest_snapshot_seqno": -1}
Jan 23 05:07:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:12.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4876 keys, 11785648 bytes, temperature: kUnknown
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832957101, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 11785648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11752290, "index_size": 20064, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12229, "raw_key_size": 123329, "raw_average_key_size": 25, "raw_value_size": 11662851, "raw_average_value_size": 2391, "num_data_blocks": 835, "num_entries": 4876, "num_filter_entries": 4876, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769162832, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.957418) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 11785648 bytes
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.959046) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 134.9 rd, 112.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 12.6 +0.0 blob) out(11.2 +0.0 blob), read-write-amplify(26.3) write-amplify(12.0) OK, records in: 5391, records dropped: 515 output_compression: NoCompression
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.959066) EVENT_LOG_v1 {"time_micros": 1769162832959057, "job": 16, "event": "compaction_finished", "compaction_time_micros": 104930, "compaction_time_cpu_micros": 28842, "output_level": 6, "num_output_files": 1, "total_output_size": 11785648, "num_input_records": 5391, "num_output_records": 4876, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832959488, "job": 16, "event": "table_file_deletion", "file_number": 32}
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832962718, "job": 16, "event": "table_file_deletion", "file_number": 30}
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.851956) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.962855) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.962861) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.962862) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.962864) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:12 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:07:12.962866) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:13 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:13 np0005593295 python3.9[193693]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 05:07:13 np0005593295 systemd[1]: Reloading.
Jan 23 05:07:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:13 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a180021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:13 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a00001c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:13 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:07:13 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:07:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:13 np0005593295 systemd[1]: Starting libvirt logging daemon socket...
Jan 23 05:07:13 np0005593295 systemd[1]: Listening on libvirt logging daemon socket.
Jan 23 05:07:13 np0005593295 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 23 05:07:13 np0005593295 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 23 05:07:13 np0005593295 systemd[1]: Starting libvirt logging daemon...
Jan 23 05:07:14 np0005593295 systemd[1]: Started libvirt logging daemon.
Jan 23 05:07:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:07:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:14.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:07:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:14 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a14001c90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100714 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:07:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:14 np0005593295 python3.9[193887]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 05:07:14 np0005593295 systemd[1]: Reloading.
Jan 23 05:07:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:14.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:14 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:07:14 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:07:15 np0005593295 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 23 05:07:15 np0005593295 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 23 05:07:15 np0005593295 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 23 05:07:15 np0005593295 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 23 05:07:15 np0005593295 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 23 05:07:15 np0005593295 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 23 05:07:15 np0005593295 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 23 05:07:15 np0005593295 systemd[1]: Starting libvirt nodedev daemon...
Jan 23 05:07:15 np0005593295 systemd[1]: Started libvirt nodedev daemon.
Jan 23 05:07:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:15 np0005593295 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 23 05:07:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:15 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a20001dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:15 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a180021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:15 np0005593295 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 23 05:07:15 np0005593295 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 23 05:07:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:15 np0005593295 python3.9[194110]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 05:07:15 np0005593295 systemd[1]: Reloading.
Jan 23 05:07:16 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:07:16 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:07:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:16.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:16 np0005593295 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 23 05:07:16 np0005593295 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 23 05:07:16 np0005593295 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 23 05:07:16 np0005593295 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 23 05:07:16 np0005593295 systemd[1]: Starting libvirt proxy daemon...
Jan 23 05:07:16 np0005593295 systemd[1]: Started libvirt proxy daemon.
Jan 23 05:07:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:16 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a00001c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:16 np0005593295 setroubleshoot[193924]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 05acf4e4-e5f9-415e-a73e-9b333aec0c09
Jan 23 05:07:16 np0005593295 setroubleshoot[193924]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 23 05:07:16 np0005593295 setroubleshoot[193924]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 05acf4e4-e5f9-415e-a73e-9b333aec0c09
Jan 23 05:07:16 np0005593295 setroubleshoot[193924]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 23 05:07:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:07:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:16.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:07:17 np0005593295 python3.9[194328]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 05:07:17 np0005593295 systemd[1]: Reloading.
Jan 23 05:07:17 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:07:17 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:07:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:17 np0005593295 systemd[1]: Listening on libvirt locking daemon socket.
Jan 23 05:07:17 np0005593295 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 23 05:07:17 np0005593295 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 23 05:07:17 np0005593295 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 23 05:07:17 np0005593295 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 23 05:07:17 np0005593295 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 23 05:07:17 np0005593295 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 23 05:07:17 np0005593295 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 23 05:07:17 np0005593295 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 23 05:07:17 np0005593295 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 23 05:07:17 np0005593295 systemd[1]: Starting libvirt QEMU daemon...
Jan 23 05:07:17 np0005593295 systemd[1]: Started libvirt QEMU daemon.
Jan 23 05:07:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:17 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a00001c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:17 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a14001c90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:18.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:18 np0005593295 python3.9[194544]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 05:07:18 np0005593295 systemd[1]: Reloading.
Jan 23 05:07:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:18 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18002ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:18 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:07:18 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:07:18 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:18 np0005593295 systemd[1]: Starting libvirt secret daemon socket...
Jan 23 05:07:18 np0005593295 systemd[1]: Listening on libvirt secret daemon socket.
Jan 23 05:07:18 np0005593295 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 23 05:07:18 np0005593295 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 23 05:07:18 np0005593295 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 23 05:07:18 np0005593295 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 23 05:07:18 np0005593295 systemd[1]: Starting libvirt secret daemon...
Jan 23 05:07:18 np0005593295 systemd[1]: Started libvirt secret daemon.
Jan 23 05:07:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:18.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:19 np0005593295 python3.9[194757]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:19 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a20001dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:19 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:20.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:20 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:20 np0005593295 python3.9[194938]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 05:07:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:20.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:21 np0005593295 python3.9[195091]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:07:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:21 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a20001dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:21 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99fc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:21 np0005593295 python3.9[195245]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 05:07:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:22.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:22 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18003800 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:22 np0005593295 podman[195368]: 2026-01-23 10:07:22.693785292 +0000 UTC m=+0.105182161 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:07:22 np0005593295 python3.9[195407]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:07:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:22.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:07:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:23 np0005593295 python3.9[195544]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162842.3597481-3381-72835351102307/.source.xml follow=False _original_basename=secret.xml.j2 checksum=19688f6e42a741164eafec41a84b8e73a76d185a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:23 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:23 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99f8000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:23 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:24 np0005593295 python3.9[195696]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine f3005f84-239a-55b6-a948-8f1fb592b920#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:07:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:24.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:24 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99fc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:24 np0005593295 python3.9[195860]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:24.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:25 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99f8000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:25 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18004120 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:26.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:26 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:26 np0005593295 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 23 05:07:26 np0005593295 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.028s CPU time.
Jan 23 05:07:26 np0005593295 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 23 05:07:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:26.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:27 np0005593295 python3.9[196325]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:27 np0005593295 podman[196449]: 2026-01-23 10:07:27.518402039 +0000 UTC m=+0.059700953 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 05:07:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:27 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99fc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:27 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99f8001ca0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:27 np0005593295 python3.9[196497]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:28.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:28 np0005593295 python3.9[196621]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162847.223294-3546-78742458669500/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:28 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18004120 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:28 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:28.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:29 np0005593295 python3.9[196774]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:29 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:29 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99fc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:29 np0005593295 python3.9[196926]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:30.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:30 np0005593295 python3.9[197005]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:30 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99f8001ca0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:30 np0005593295 python3.9[197158]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:30.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:31 np0005593295 python3.9[197236]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.mba36d0j recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:31 np0005593295 auditd[703]: Audit daemon rotating log files
Jan 23 05:07:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:31 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18004e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:31 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99f8001ca0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:31 np0005593295 python3.9[197388]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:32.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:32 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:32 np0005593295 python3.9[197467]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:32.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:33 np0005593295 python3.9[197620]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:07:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:33 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:33 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18004e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:33 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18004e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:33 np0005593295 python3[197773]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 05:07:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:07:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:34.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:07:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:34 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99f8001ca0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:34 np0005593295 python3.9[197928]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:34.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:35 np0005593295 python3.9[198006]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:35 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:35 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:35 np0005593295 python3.9[198158]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:36.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:36 np0005593295 python3.9[198284]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162855.257572-3813-12043831208285/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:36 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:36 np0005593295 python3.9[198437]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:36.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:37 np0005593295 python3.9[198515]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:37 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99f8003490 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:37 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a000030f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:38 np0005593295 python3.9[198667]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:38.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:38 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18004e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:38 np0005593295 python3.9[198746]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:38 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:38.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:39 np0005593295 python3.9[198899]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:39 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:39 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99f8003490 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:39 np0005593295 python3.9[199024]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162858.670145-3930-38712543530726/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:40.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:40 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a000030f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:40 np0005593295 python3.9[199202]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:40.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:41 np0005593295 python3.9[199355]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:07:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:41 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18004e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:41 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:42 np0005593295 python3.9[199510]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:42.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:42 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99f8003490 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:42 np0005593295 python3.9[199664]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:07:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:42.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:43 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:43 np0005593295 python3.9[199817]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:07:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:43 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a00003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:43 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18004e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:44.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:44 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a200095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:44 np0005593295 python3.9[199973]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:07:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:07:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:44.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:07:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:45 np0005593295 python3.9[200128]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:45 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99f8003490 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:45 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a00003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:46 np0005593295 python3.9[200280]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:46.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:46 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a18004e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:46 np0005593295 python3.9[200405]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162865.591543-4146-56111117212129/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:07:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:46.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:07:47 np0005593295 python3.9[200557]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:47 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a20009f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:47 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99f8003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:47 np0005593295 python3.9[200680]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162866.7864878-4191-11101899059322/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:48.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:48 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9a00003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:48 np0005593295 python3.9[200834]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:07:48 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:48.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:48 np0005593295 python3.9[200957]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162868.0675416-4236-208739899222965/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:07:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:49 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99fc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:07:49 np0005593295 kernel: ganesha.nfsd[194760]: segfault at 50 ip 00007f9aaabbe32e sp 00007f9a11ffa210 error 4 in libntirpc.so.5.8[7f9aaaba3000+2c000] likely on CPU 1 (core 0, socket 1)
Jan 23 05:07:49 np0005593295 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 05:07:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[190347]: 23/01/2026 10:07:49 : epoch 69734840 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f99fc002b10 fd 38 proxy ignored for local
Jan 23 05:07:49 np0005593295 systemd[1]: Started Process Core Dump (PID 201110/UID 0).
Jan 23 05:07:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:49 np0005593295 python3.9[201109]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:07:49 np0005593295 systemd[1]: Reloading.
Jan 23 05:07:50 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:07:50 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:07:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:50.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:50 np0005593295 systemd[1]: Reached target edpm_libvirt.target.
Jan 23 05:07:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:07:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:50.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:07:51 np0005593295 systemd-coredump[201111]: Process 190355 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 57:#012#0  0x00007f9aaabbe32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012#1  0x0000000000000000 n/a (n/a + 0x0)#012#2  0x00007f9aaabc8900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)#012ELF object binary architecture: AMD x86-64
Jan 23 05:07:51 np0005593295 python3.9[201306]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 23 05:07:51 np0005593295 systemd[1]: Reloading.
Jan 23 05:07:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:51 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:07:51 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:07:51 np0005593295 podman[201309]: 2026-01-23 10:07:51.338951704 +0000 UTC m=+0.038551528 container died cc70552ffcc96627532b5a08d41512b300ebec5bdbe07c25e585b097491a9291 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 05:07:51 np0005593295 podman[201309]: 2026-01-23 10:07:51.376625159 +0000 UTC m=+0.076224953 container remove cc70552ffcc96627532b5a08d41512b300ebec5bdbe07c25e585b097491a9291 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid)
Jan 23 05:07:51 np0005593295 systemd[1]: var-lib-containers-storage-overlay-4f9d26f84440f7e3bde72f9674f205f83ca47255b101bf5827a1c2646cf3b58f-merged.mount: Deactivated successfully.
Jan 23 05:07:51 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 05:07:51 np0005593295 systemd[1]: systemd-coredump@7-201110-0.service: Deactivated successfully.
Jan 23 05:07:51 np0005593295 systemd[1]: systemd-coredump@7-201110-0.service: Consumed 1.457s CPU time.
Jan 23 05:07:51 np0005593295 systemd[1]: Reloading.
Jan 23 05:07:51 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:07:51 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:07:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:51 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 05:07:51 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.810s CPU time.
Jan 23 05:07:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:52.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:52 np0005593295 systemd[1]: session-52.scope: Deactivated successfully.
Jan 23 05:07:52 np0005593295 systemd[1]: session-52.scope: Consumed 3min 24.487s CPU time.
Jan 23 05:07:52 np0005593295 systemd-logind[786]: Session 52 logged out. Waiting for processes to exit.
Jan 23 05:07:52 np0005593295 systemd-logind[786]: Removed session 52.
Jan 23 05:07:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:52.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:53 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:53 np0005593295 podman[201451]: 2026-01-23 10:07:53.670683112 +0000 UTC m=+0.096709101 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 23 05:07:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:54.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:54.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:07:55.469 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:07:55.470 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:07:55.470 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100755 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:07:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:07:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:56.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:07:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:56.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:57 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:07:57 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:07:57 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:07:57 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:07:57 np0005593295 podman[201566]: 2026-01-23 10:07:57.622616164 +0000 UTC m=+0.048764271 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 05:07:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:57 np0005593295 systemd-logind[786]: New session 53 of user zuul.
Jan 23 05:07:57 np0005593295 systemd[1]: Started Session 53 of User zuul.
Jan 23 05:07:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:58.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:58 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:58 np0005593295 python3.9[201742]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 05:07:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:07:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:07:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:58.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:07:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:07:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:07:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:07:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:00.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:00 np0005593295 python3.9[201897]: ansible-ansible.builtin.service_facts Invoked
Jan 23 05:08:00 np0005593295 network[201940]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 05:08:00 np0005593295 network[201941]: 'network-scripts' will be removed from distribution in near future.
Jan 23 05:08:00 np0005593295 network[201942]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 05:08:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:01.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:01 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 8.
Jan 23 05:08:01 np0005593295 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:08:01 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.810s CPU time.
Jan 23 05:08:01 np0005593295 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 05:08:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:02.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:02 np0005593295 podman[202049]: 2026-01-23 10:08:02.209384139 +0000 UTC m=+0.043938142 container create 075bd0906bffbd67deee1972885ea17223f9ea73d23664c030537d727fd0a3ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 05:08:02 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2e895362fded70b70c158ad61887411a6935be2a9259c6a533cbeaa6d0ebd47/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 05:08:02 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2e895362fded70b70c158ad61887411a6935be2a9259c6a533cbeaa6d0ebd47/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 05:08:02 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2e895362fded70b70c158ad61887411a6935be2a9259c6a533cbeaa6d0ebd47/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:08:02 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2e895362fded70b70c158ad61887411a6935be2a9259c6a533cbeaa6d0ebd47/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:08:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:02 np0005593295 podman[202049]: 2026-01-23 10:08:02.270440543 +0000 UTC m=+0.104994536 container init 075bd0906bffbd67deee1972885ea17223f9ea73d23664c030537d727fd0a3ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 05:08:02 np0005593295 podman[202049]: 2026-01-23 10:08:02.27673612 +0000 UTC m=+0.111290113 container start 075bd0906bffbd67deee1972885ea17223f9ea73d23664c030537d727fd0a3ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 05:08:02 np0005593295 bash[202049]: 075bd0906bffbd67deee1972885ea17223f9ea73d23664c030537d727fd0a3ab
Jan 23 05:08:02 np0005593295 podman[202049]: 2026-01-23 10:08:02.191546926 +0000 UTC m=+0.026100939 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 05:08:02 np0005593295 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:08:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:02 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 05:08:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:02 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 05:08:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:02 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 05:08:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:02 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 05:08:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:02 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 05:08:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:02 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 05:08:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:02 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 05:08:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:02 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:08:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:03.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:03 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:03 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:08:03 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:08:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:04.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:05.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:05 np0005593295 python3.9[202341]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 05:08:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:06 np0005593295 python3.9[202425]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 05:08:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:06.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:07.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:08.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:08 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:08 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:08:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:08 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:08:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:09.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:08:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:10.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:08:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:11.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:12.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:12 np0005593295 python3.9[202586]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:08:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:08:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:13.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:08:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:13 np0005593295 python3.9[202738]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:08:13 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:14.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:14 np0005593295 python3.9[202892]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 05:08:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:15.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:15 np0005593295 python3.9[203058]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:08:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:15 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:15 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:15 np0005593295 python3.9[203211]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:08:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:08:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:16.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:08:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:16 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f5c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:16 np0005593295 python3.9[203337]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162895.3848317-243-133858745391706/.source.iscsi _original_basename=.1yc12p3o follow=False checksum=91c74ac05ca8208bb4cea7a74bd001dd4447ebae backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:17.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:17 np0005593295 python3.9[203489]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:17 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:17 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100817 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:08:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:18 np0005593295 python3.9[203642]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:18.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:18 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f700016c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:18 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:19.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:19 np0005593295 python3.9[203795]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:08:19 np0005593295 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 23 05:08:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:19 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f5c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:19 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:20.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:20 np0005593295 python3.9[203951]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:08:20 np0005593295 systemd[1]: Reloading.
Jan 23 05:08:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:20 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:08:20 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:08:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:20 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:20 np0005593295 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 23 05:08:20 np0005593295 systemd[1]: Starting Open-iSCSI...
Jan 23 05:08:21 np0005593295 kernel: Loading iSCSI transport class v2.0-870.
Jan 23 05:08:21 np0005593295 systemd[1]: Started Open-iSCSI.
Jan 23 05:08:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:21.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:21 np0005593295 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 23 05:08:21 np0005593295 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 23 05:08:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:21 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f700021c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:21 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f5c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:22 np0005593295 python3.9[204177]: ansible-ansible.builtin.service_facts Invoked
Jan 23 05:08:22 np0005593295 network[204194]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 05:08:22 np0005593295 network[204195]: 'network-scripts' will be removed from distribution in near future.
Jan 23 05:08:22 np0005593295 network[204196]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 05:08:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:22.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:22 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:23.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:23 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:23 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:23 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f700021c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:23 np0005593295 podman[204247]: 2026-01-23 10:08:23.814531734 +0000 UTC m=+0.096371625 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 23 05:08:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:24.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:24 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f5c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:25.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:25 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:25 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:26.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:26 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f700021c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:26 np0005593295 python3.9[204499]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 05:08:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:27.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:27 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f5c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:27 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:28.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:28 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:28 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:28 np0005593295 podman[204507]: 2026-01-23 10:08:28.637567109 +0000 UTC m=+0.055430471 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:08:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:08:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:29.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:08:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:29 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:29 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80001930 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:30 np0005593295 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 05:08:30 np0005593295 systemd[1]: Starting man-db-cache-update.service...
Jan 23 05:08:30 np0005593295 systemd[1]: Reloading.
Jan 23 05:08:30 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:08:30 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:08:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:30.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:30 np0005593295 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 05:08:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:30 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:30 np0005593295 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 05:08:30 np0005593295 systemd[1]: Finished man-db-cache-update.service.
Jan 23 05:08:30 np0005593295 systemd[1]: run-rdaddf7c78ec94e55a5c922f3bcca0ef9.service: Deactivated successfully.
Jan 23 05:08:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:31.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:31 np0005593295 python3.9[204840]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 23 05:08:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:31 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:31 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:08:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:32.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:08:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:32 np0005593295 python3.9[204993]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 23 05:08:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:32 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:33 np0005593295 python3.9[205150]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:08:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:33.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:33 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:33 np0005593295 python3.9[205273]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162912.6237454-507-39111210236565/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:33 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:33 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:08:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:34.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:08:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:34 np0005593295 python3.9[205426]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:34 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:35.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:35 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50001b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:35 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:35 np0005593295 python3.9[205579]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 05:08:35 np0005593295 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 23 05:08:35 np0005593295 systemd[1]: Stopped Load Kernel Modules.
Jan 23 05:08:35 np0005593295 systemd[1]: Stopping Load Kernel Modules...
Jan 23 05:08:35 np0005593295 systemd[1]: Starting Load Kernel Modules...
Jan 23 05:08:35 np0005593295 systemd[1]: Finished Load Kernel Modules.
Jan 23 05:08:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:08:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:36.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:08:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:36 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:36 np0005593295 python3.9[205737]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:08:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:37.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:37 np0005593295 python3.9[205890]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:08:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:37 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:37 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50001b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:38 np0005593295 python3.9[206043]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:08:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:38.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:38 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:38 np0005593295 python3.9[206167]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162917.7339756-660-73664365490608/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:38 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:39.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:39 np0005593295 python3.9[206319]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:08:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:39 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:39 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f800095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:40 np0005593295 python3.9[206473]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:40.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:40 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f800095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:08:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:41.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:08:41 np0005593295 python3.9[206626]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:41 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:41 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:41 np0005593295 python3.9[206803]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:08:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:42.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:08:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:42 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:42 np0005593295 python3.9[206957]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:43.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:43 np0005593295 python3.9[207109]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:43 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:43 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f800095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:43 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:43 np0005593295 python3.9[207261]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:08:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:44.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:08:44 np0005593295 python3.9[207414]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:44 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:45.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:45 np0005593295 python3.9[207567]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:08:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:45 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:45 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f800095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:45 np0005593295 python3.9[207721]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:08:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:46.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:46 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:46 np0005593295 python3.9[207876]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:08:46 np0005593295 systemd[1]: Listening on multipathd control socket.
Jan 23 05:08:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:47.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:47 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:47 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:47 np0005593295 python3.9[208032]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:08:47 np0005593295 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 23 05:08:47 np0005593295 udevadm[208037]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 23 05:08:47 np0005593295 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 23 05:08:47 np0005593295 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 23 05:08:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:47 np0005593295 multipathd[208040]: --------start up--------
Jan 23 05:08:47 np0005593295 multipathd[208040]: read /etc/multipath.conf
Jan 23 05:08:47 np0005593295 multipathd[208040]: path checkers start up
Jan 23 05:08:47 np0005593295 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 23 05:08:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:08:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:48.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:08:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:48 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f8000a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:48 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:48 np0005593295 python3.9[208201]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 23 05:08:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:49.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:49 np0005593295 python3.9[208353]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 23 05:08:49 np0005593295 kernel: Key type psk registered
Jan 23 05:08:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:49 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:49 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:50.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:50 np0005593295 python3.9[208515]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:08:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:50 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:50 np0005593295 python3.9[208639]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162929.9599311-1050-89432496677408/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:08:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:51.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:08:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:51 np0005593295 python3.9[208791]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:08:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:51 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f8000a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:51 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:08:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:52.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:08:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:52 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:52 np0005593295 python3.9[208944]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 05:08:52 np0005593295 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 23 05:08:52 np0005593295 systemd[1]: Stopped Load Kernel Modules.
Jan 23 05:08:52 np0005593295 systemd[1]: Stopping Load Kernel Modules...
Jan 23 05:08:52 np0005593295 systemd[1]: Starting Load Kernel Modules...
Jan 23 05:08:52 np0005593295 systemd[1]: Finished Load Kernel Modules.
Jan 23 05:08:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:53.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:53 np0005593295 python3.9[209101]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 05:08:53 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:53 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:53 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f8000a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:54.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:54 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:54 np0005593295 podman[209105]: 2026-01-23 10:08:54.682876386 +0000 UTC m=+0.105889215 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller)
Jan 23 05:08:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:55.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:08:55.471 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:08:55.472 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:08:55.472 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:55 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:55 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:55 np0005593295 systemd[1]: Reloading.
Jan 23 05:08:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:55 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:08:55 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:08:56 np0005593295 systemd[1]: Reloading.
Jan 23 05:08:56 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:08:56 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:08:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:08:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:56.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:08:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:56 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f8000a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:56 np0005593295 systemd-logind[786]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 23 05:08:56 np0005593295 systemd-logind[786]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 23 05:08:56 np0005593295 lvm[209246]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 05:08:56 np0005593295 lvm[209246]: VG ceph_vg0 finished
Jan 23 05:08:56 np0005593295 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 05:08:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:56 np0005593295 systemd[1]: Starting man-db-cache-update.service...
Jan 23 05:08:56 np0005593295 systemd[1]: Reloading.
Jan 23 05:08:56 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:08:56 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:08:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:57.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:57 np0005593295 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 05:08:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:57 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:57 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:58.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:58 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:58 np0005593295 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 05:08:58 np0005593295 systemd[1]: Finished man-db-cache-update.service.
Jan 23 05:08:58 np0005593295 systemd[1]: man-db-cache-update.service: Consumed 1.461s CPU time.
Jan 23 05:08:58 np0005593295 systemd[1]: run-r372e5d1c1d7046f8b4804c42f3fe1e95.service: Deactivated successfully.
Jan 23 05:08:58 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:08:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:59.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:08:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:08:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100859 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:08:59 np0005593295 podman[210476]: 2026-01-23 10:08:59.649676043 +0000 UTC m=+0.078147579 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 05:08:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:59 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:08:59 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:08:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:08:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.003000073s ======
Jan 23 05:09:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:00.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000073s
Jan 23 05:09:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:00 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:01.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:01 np0005593295 python3.9[210652]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 05:09:01 np0005593295 iscsid[204018]: iscsid shutting down.
Jan 23 05:09:01 np0005593295 systemd[1]: Stopping Open-iSCSI...
Jan 23 05:09:01 np0005593295 systemd[1]: iscsid.service: Deactivated successfully.
Jan 23 05:09:01 np0005593295 systemd[1]: Stopped Open-iSCSI.
Jan 23 05:09:01 np0005593295 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 23 05:09:01 np0005593295 systemd[1]: Starting Open-iSCSI...
Jan 23 05:09:01 np0005593295 systemd[1]: Started Open-iSCSI.
Jan 23 05:09:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:01 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:01 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f70001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:02.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:02 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:03.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:03 np0005593295 python3.9[210879]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 05:09:03 np0005593295 multipathd[208040]: exit (signal)
Jan 23 05:09:03 np0005593295 multipathd[208040]: --------shut down-------
Jan 23 05:09:03 np0005593295 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 23 05:09:03 np0005593295 systemd[1]: multipathd.service: Deactivated successfully.
Jan 23 05:09:03 np0005593295 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 23 05:09:03 np0005593295 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 23 05:09:03 np0005593295 multipathd[210897]: --------start up--------
Jan 23 05:09:03 np0005593295 multipathd[210897]: read /etc/multipath.conf
Jan 23 05:09:03 np0005593295 multipathd[210897]: path checkers start up
Jan 23 05:09:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:03 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:03 np0005593295 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 23 05:09:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:03 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:03 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:04.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:04 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f70001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:04 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:09:04 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:09:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:04 np0005593295 python3.9[211056]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 05:09:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:05.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:09:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:05 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:05 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:05.927611) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162945927859, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 1320, "num_deletes": 260, "total_data_size": 3219343, "memory_usage": 3265056, "flush_reason": "Manual Compaction"}
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162945942148, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 2098879, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18059, "largest_seqno": 19374, "table_properties": {"data_size": 2093326, "index_size": 2947, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11386, "raw_average_key_size": 18, "raw_value_size": 2082056, "raw_average_value_size": 3402, "num_data_blocks": 132, "num_entries": 612, "num_filter_entries": 612, "num_deletions": 260, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162834, "oldest_key_time": 1769162834, "file_creation_time": 1769162945, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 14578 microseconds, and 6138 cpu microseconds.
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:05.942251) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 2098879 bytes OK
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:05.942286) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:05.944061) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:05.944089) EVENT_LOG_v1 {"time_micros": 1769162945944084, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:05.944113) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 3213119, prev total WAL file size 3213119, number of live WAL files 2.
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:05.945372) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323536' seq:0, type:0; will stop at (end)
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(2049KB)], [33(11MB)]
Jan 23 05:09:05 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162945945577, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 13884527, "oldest_snapshot_seqno": -1}
Jan 23 05:09:06 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4954 keys, 13427325 bytes, temperature: kUnknown
Jan 23 05:09:06 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162946033892, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 13427325, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13392414, "index_size": 21425, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12421, "raw_key_size": 126088, "raw_average_key_size": 25, "raw_value_size": 13300515, "raw_average_value_size": 2684, "num_data_blocks": 881, "num_entries": 4954, "num_filter_entries": 4954, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769162945, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:09:06 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:09:06 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:06.034127) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 13427325 bytes
Jan 23 05:09:06 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:06.035788) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 157.1 rd, 151.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 11.2 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(13.0) write-amplify(6.4) OK, records in: 5488, records dropped: 534 output_compression: NoCompression
Jan 23 05:09:06 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:06.035809) EVENT_LOG_v1 {"time_micros": 1769162946035796, "job": 18, "event": "compaction_finished", "compaction_time_micros": 88379, "compaction_time_cpu_micros": 29172, "output_level": 6, "num_output_files": 1, "total_output_size": 13427325, "num_input_records": 5488, "num_output_records": 4954, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:09:06 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:09:06 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162946036662, "job": 18, "event": "table_file_deletion", "file_number": 35}
Jan 23 05:09:06 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:09:06 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162946038957, "job": 18, "event": "table_file_deletion", "file_number": 33}
Jan 23 05:09:06 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:05.945121) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:06 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:06.039119) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:06 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:06.039127) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:06 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:06.039129) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:06 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:06.039131) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:06 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:09:06.039133) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:06.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:06 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:06 np0005593295 python3.9[211214]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:07.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:07 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f70001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:07 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:08 np0005593295 python3.9[211367]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 05:09:08 np0005593295 systemd[1]: Reloading.
Jan 23 05:09:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:09:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:08.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:09:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:08 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:09:08 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:09:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:08 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:08 : epoch 69734882 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:09:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:08 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:09.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:09 np0005593295 python3.9[211554]: ansible-ansible.builtin.service_facts Invoked
Jan 23 05:09:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:09 np0005593295 network[211571]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 05:09:09 np0005593295 network[211572]: 'network-scripts' will be removed from distribution in near future.
Jan 23 05:09:09 np0005593295 network[211573]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 05:09:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:09 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:09 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f70001ab0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:09:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:10.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:09:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:10 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:09:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:11.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:09:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:11 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:09:11 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:09:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:11 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:11 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:11 : epoch 69734882 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:09:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:11 : epoch 69734882 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:09:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:12.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:12 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f70003390 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:13.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:13 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:13 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:13 np0005593295 python3.9[211875]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:09:13 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:14.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:14 np0005593295 python3.9[212030]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:09:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:09:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:15.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:15 np0005593295 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 23 05:09:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:15 np0005593295 python3.9[212183]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:09:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:15 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:15 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:16 np0005593295 python3.9[212337]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:09:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:09:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:16.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:09:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:16 np0005593295 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 23 05:09:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:16 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58003870 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:16 np0005593295 python3.9[212493]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:09:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:17.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:17 np0005593295 python3.9[212646]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:09:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:17 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f70003390 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:17 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:18.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:18 np0005593295 python3.9[212800]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:09:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:18 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:18 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:19 np0005593295 python3.9[212954]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:09:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:19.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:19 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:19 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f700040a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:20.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:20 np0005593295 python3.9[213108]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:20 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:21 np0005593295 python3.9[213261]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:09:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:21.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:09:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/100921 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:09:21 np0005593295 python3.9[213438]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:21 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:21 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:22 np0005593295 python3.9[213591]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:22.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:22 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:22 np0005593295 python3.9[213744]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:09:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:23.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:09:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:23 np0005593295 python3.9[213896]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:23 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:23 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:23 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:24 np0005593295 python3.9[214048]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:09:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:24.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:09:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:24 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f700040a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:24 np0005593295 python3.9[214202]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:25.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:25 np0005593295 podman[214326]: 2026-01-23 10:09:25.388755943 +0000 UTC m=+0.100376217 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 23 05:09:25 np0005593295 python3.9[214371]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:25 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:25 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:26 np0005593295 python3.9[214533]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:09:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:26.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:09:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:26 np0005593295 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 23 05:09:26 np0005593295 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 23 05:09:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:26 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:26 np0005593295 python3.9[214689]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:27.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:27 np0005593295 python3.9[214841]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:27 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f700040a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:27 np0005593295 python3.9[214993]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:27 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:28.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:28 np0005593295 python3.9[215146]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:28 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:28 np0005593295 python3.9[215299]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:28 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:09:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:29.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:09:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:29 np0005593295 podman[215451]: 2026-01-23 10:09:29.738553103 +0000 UTC m=+0.048342662 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 23 05:09:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:29 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:29 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f700040a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:29 np0005593295 python3.9[215452]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:09:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:30.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:30 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:30 np0005593295 python3.9[215624]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:09:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:31.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:31 np0005593295 python3.9[215776]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 05:09:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:31 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:31 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:32.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:32 np0005593295 python3.9[215931]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 05:09:32 np0005593295 systemd[1]: Reloading.
Jan 23 05:09:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:32 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f54000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:32 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:09:32 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:09:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:33.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:33 np0005593295 python3.9[216119]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:09:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:33 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:33 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80001e60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:33 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:34 np0005593295 python3.9[216272]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:09:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:34.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:34 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:34 np0005593295 python3.9[216427]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:09:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:35.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:35 np0005593295 python3.9[216580]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:09:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:35 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f540016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:35 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f540016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:35 np0005593295 python3.9[216733]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:09:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:36.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:36 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80001e60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:36 np0005593295 python3.9[216888]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:09:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:09:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:37.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:09:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:37 np0005593295 python3.9[217041]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:09:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:37 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:37 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f540016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:38.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:38 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f540016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:38 np0005593295 python3.9[217196]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 05:09:38 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:39.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:39 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80001e60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:39 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:40.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:40 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f540016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:41.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:41 np0005593295 python3.9[217376]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:09:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:41 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f540016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:41 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80001e60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:42.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:42 np0005593295 python3.9[217530]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:09:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:42 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:43 np0005593295 python3.9[217682]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:09:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:09:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:43.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:09:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:43 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58004580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:43 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f540016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:43 np0005593295 python3.9[217834]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:09:43 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:09:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:44.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:09:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:44 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80001e60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:44 np0005593295 python3.9[217988]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:09:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:45 np0005593295 python3.9[218140]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:09:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:45.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:45 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:45 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f580045a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:46 np0005593295 python3.9[218292]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:09:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:46.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:46 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f540036e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:46 np0005593295 python3.9[218446]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:09:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:47.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:47 np0005593295 python3.9[218598]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:09:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:47 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:47 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:47 np0005593295 python3.9[218750]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:09:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:48.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:48 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f580045c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:48 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:09:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:49.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:09:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:49 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f540036e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:49 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:09:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:50.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:09:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:50 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:09:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:51.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:09:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:51 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:51 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:09:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:52.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:09:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:52 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f74001110 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:09:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:53.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:09:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:53 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:53 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:53 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:54.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:54 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:54 np0005593295 python3.9[218912]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 23 05:09:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:55.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:09:55.472 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:09:55.473 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:09:55.474 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:55 np0005593295 podman[219037]: 2026-01-23 10:09:55.687000365 +0000 UTC m=+0.124084792 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_controller)
Jan 23 05:09:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:55 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:55 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:55 np0005593295 python3.9[219084]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 05:09:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:09:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:56.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:09:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:56 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:09:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:57.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:09:57 np0005593295 python3.9[219251]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 05:09:57 np0005593295 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:09:57 np0005593295 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:09:57 np0005593295 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:09:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:57 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f54003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:57 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:09:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:58.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:09:58 np0005593295 systemd-logind[786]: New session 54 of user zuul.
Jan 23 05:09:58 np0005593295 systemd[1]: Started Session 54 of User zuul.
Jan 23 05:09:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:58 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:58 np0005593295 systemd[1]: session-54.scope: Deactivated successfully.
Jan 23 05:09:58 np0005593295 systemd-logind[786]: Session 54 logged out. Waiting for processes to exit.
Jan 23 05:09:58 np0005593295 systemd-logind[786]: Removed session 54.
Jan 23 05:09:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:58 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:09:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:59.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:09:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:59 np0005593295 python3.9[219440]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:09:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:59 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f74001eb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:09:59 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f54003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:09:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:09:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:09:59 np0005593295 python3.9[219561]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162999.0075846-2657-58509827036090/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:10:00 np0005593295 podman[219563]: 2026-01-23 10:10:00.025899092 +0000 UTC m=+0.058068097 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 05:10:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:10:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:00.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:10:00 np0005593295 python3.9[219732]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:10:00 np0005593295 ceph-mon[75771]: overall HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 05:10:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:00 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0043f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:00 np0005593295 python3.9[219808]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:10:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:01.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:01 np0005593295 python3.9[219958]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:10:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:01 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:01 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f74001eb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:01 np0005593295 python3.9[220104]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163001.0560858-2657-199495556742565/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:10:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:02.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:02 np0005593295 python3.9[220256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:10:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:02 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f54003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:03 np0005593295 python3.9[220377]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163002.1102335-2657-31445433639788/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:10:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:03.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:03 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0043f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:03 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:03 np0005593295 python3.9[220527]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:10:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:03 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:04 np0005593295 python3.9[220649]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163003.1510732-2657-27368902594632/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:10:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:10:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:04.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:10:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:04 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f74002bc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:04 np0005593295 python3.9[220800]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:10:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:10:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:05.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:10:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:05 np0005593295 python3.9[220921]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163004.4312184-2657-35097282766233/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:10:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:05 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f54003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:05 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0043f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:06.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:06 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:10:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:07.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:10:07 np0005593295 python3.9[221075]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:10:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:07 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f74002bc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:07 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f54003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:07 np0005593295 python3.9[221227]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:10:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:08.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:08 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0043f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:08 np0005593295 python3.9[221381]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:10:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:08 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:09.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:09 np0005593295 python3.9[221533]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:10:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:09 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:09 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f740038d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:09 np0005593295 python3.9[221656]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769163009.0552902-2980-68720861896106/.source _original_basename=.qvr9pjna follow=False checksum=7ec3f985e290d2ef791d15595ca8f7c7030c8ec4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 23 05:10:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:10:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:10.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:10:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:10 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f54003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:10 np0005593295 python3.9[221811]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:10:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:11.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:11 np0005593295 python3.9[221988]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:10:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:11 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0043f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:11 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:12 np0005593295 python3.9[222158]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163011.1817672-3056-124980180761381/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=53b8456782b81b5794d3eef3fadcfb00db1088a8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:10:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:12.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:12 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f740038d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:12 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:10:12 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:10:12 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:10:12 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:10:12 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:10:12 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:10:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:12 np0005593295 python3.9[222319]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 05:10:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:13.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:13 np0005593295 python3.9[222440]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163012.3389883-3101-258636953608352/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=0333d3a3f5c3a0526b0ebe430250032166710e8a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 05:10:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:13 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f740038d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:13 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0043f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:13 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:14.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:14 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:14 np0005593295 python3.9[222594]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 23 05:10:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:15.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:15 np0005593295 python3.9[222746]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 05:10:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:15 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:15 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f54003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:10:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:16.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:10:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:16 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0043f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:16 np0005593295 python3[222900]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 05:10:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:10:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:17.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:10:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:17 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:17 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f740038d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:18 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:10:18 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:10:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:10:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:18.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:10:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:18 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f54003a00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:18 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:19.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:19 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0043f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:19 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:20.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:20 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f740038d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:10:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:21.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:10:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:21 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f54003a20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:21 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0043f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:10:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:22.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:10:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:22 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:10:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:23.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:10:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:23 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f740038d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:23 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f54003a40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:23 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:10:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:24.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:10:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:24 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f580014d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:25.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:25 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:25 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f740038d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:26.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:26 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50002050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:27.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:27 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f580014d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:27 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f80009cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:28.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:28 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f740038d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:28 np0005593295 podman[223051]: 2026-01-23 10:10:28.780583362 +0000 UTC m=+2.207214381 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:10:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:28 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:29 np0005593295 podman[222915]: 2026-01-23 10:10:29.255692891 +0000 UTC m=+12.234540152 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 23 05:10:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:29.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:29 np0005593295 podman[223102]: 2026-01-23 10:10:29.420849371 +0000 UTC m=+0.058015506 container create ab35065d5c90da8f59fd3b2ff3626e82eb030c0e98065938576df5f518cd313c (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, container_name=nova_compute_init, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 23 05:10:29 np0005593295 podman[223102]: 2026-01-23 10:10:29.39109285 +0000 UTC m=+0.028259035 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 23 05:10:29 np0005593295 python3[222900]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 23 05:10:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:29 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f50002050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:29 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f580014d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:30 np0005593295 podman[223265]: 2026-01-23 10:10:30.241543426 +0000 UTC m=+0.044128975 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 05:10:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:10:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:30.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:10:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:30 np0005593295 python3.9[223312]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:10:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:30 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f740038d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:10:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:10:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:31.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:10:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:31 np0005593295 python3.9[223467]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 23 05:10:31 np0005593295 kernel: ganesha.nfsd[218780]: segfault at 50 ip 00007f8000eeb32e sp 00007f7f7affc210 error 4 in libntirpc.so.5.8[7f8000ed0000+2c000] likely on CPU 4 (core 0, socket 4)
Jan 23 05:10:31 np0005593295 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 05:10:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[202067]: 23/01/2026 10:10:31 : epoch 69734882 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f740038d0 fd 48 proxy ignored for local
Jan 23 05:10:31 np0005593295 systemd[1]: Started Process Core Dump (PID 223492/UID 0).
Jan 23 05:10:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:10:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:32.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:10:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:32 np0005593295 python3.9[223623]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 05:10:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:33.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:34.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:34 np0005593295 python3[223775]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 05:10:34 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:34 np0005593295 podman[223812]: 2026-01-23 10:10:34.674853522 +0000 UTC m=+0.022331940 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 23 05:10:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:35.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:36.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:37.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:10:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:38.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:10:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:10:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:39.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:10:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:39 np0005593295 ceph-mds[83039]: mds.beacon.cephfs.compute-2.prgzmm missed beacon ack from the monitors
Jan 23 05:10:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:40 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:10:40 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 3570 writes, 20K keys, 3570 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.04 MB/s#012Cumulative WAL: 3569 writes, 3569 syncs, 1.00 writes per sync, written: 0.05 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1383 writes, 6411 keys, 1383 commit groups, 1.0 writes per commit group, ingest: 16.17 MB, 0.03 MB/s#012Interval WAL: 1382 writes, 1382 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     84.9      0.33              0.13         9    0.036       0      0       0.0       0.0#012  L6      1/0   12.81 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.5     73.9     64.1      1.53              0.51         8    0.191     39K   4177       0.0       0.0#012 Sum      1/0   12.81 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.5     60.9     67.8      1.85              0.64        17    0.109     39K   4177       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   6.1     95.0     95.5      0.46              0.13         6    0.076     16K   1877       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0     73.9     64.1      1.53              0.51         8    0.191     39K   4177       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     85.4      0.32              0.13         8    0.041       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.027, interval 0.007#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.12 GB write, 0.10 MB/s write, 0.11 GB read, 0.09 MB/s read, 1.9 seconds#012Interval compaction: 0.04 GB write, 0.07 MB/s write, 0.04 GB read, 0.07 MB/s read, 0.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55c6513709b0#2 capacity: 304.00 MB usage: 4.90 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000112 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(263,4.57 MB,1.50339%) FilterBlock(17,118.48 KB,0.0380616%) IndexBlock(17,221.48 KB,0.0711491%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 05:10:40 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:40 np0005593295 systemd-coredump[223493]: Process 202071 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 62:#012#0  0x00007f8000eeb32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012#1  0x0000000000000000 n/a (n/a + 0x0)#012#2  0x00007f8000ef5900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)#012ELF object binary architecture: AMD x86-64
Jan 23 05:10:40 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).paxos(paxos updating c 1256..1981) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 0.590739429s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Jan 23 05:10:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2[75767]: 2026-01-23T10:10:40.289+0000 7fdb29d57640 -1 mon.compute-2@1(peon).paxos(paxos updating c 1256..1981) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 0.590739429s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Jan 23 05:10:40 np0005593295 podman[223812]: 2026-01-23 10:10:40.302420944 +0000 UTC m=+5.649899342 container create f99dfdec5ed1b13c4bc030b38f1e4e63000d9ea357b09a559200a7501b063f8e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:10:40 np0005593295 python3[223775]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b kolla_start
Jan 23 05:10:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:10:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:40.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:10:40 np0005593295 systemd[1]: systemd-coredump@8-223492-0.service: Deactivated successfully.
Jan 23 05:10:40 np0005593295 systemd[1]: systemd-coredump@8-223492-0.service: Consumed 1.235s CPU time.
Jan 23 05:10:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:40 np0005593295 podman[223854]: 2026-01-23 10:10:40.405620111 +0000 UTC m=+0.024027583 container died 075bd0906bffbd67deee1972885ea17223f9ea73d23664c030537d727fd0a3ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 05:10:40 np0005593295 systemd[1]: var-lib-containers-storage-overlay-c2e895362fded70b70c158ad61887411a6935be2a9259c6a533cbeaa6d0ebd47-merged.mount: Deactivated successfully.
Jan 23 05:10:40 np0005593295 podman[223854]: 2026-01-23 10:10:40.446071414 +0000 UTC m=+0.064478866 container remove 075bd0906bffbd67deee1972885ea17223f9ea73d23664c030537d727fd0a3ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid)
Jan 23 05:10:40 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 05:10:40 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 05:10:40 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.722s CPU time.
Jan 23 05:10:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:41 np0005593295 python3.9[224051]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:10:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:10:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:41.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:10:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:41 np0005593295 python3.9[224230]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:10:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:42.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:10:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:43.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:10:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:44.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:44 np0005593295 python3.9[224382]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769163041.9439676-3389-84431096401087/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 05:10:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:44 np0005593295 python3.9[224461]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 05:10:44 np0005593295 systemd[1]: Reloading.
Jan 23 05:10:45 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:10:45 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:10:45 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:45.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/101045 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:10:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:45 np0005593295 python3.9[224573]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 05:10:45 np0005593295 systemd[1]: Reloading.
Jan 23 05:10:46 np0005593295 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 05:10:46 np0005593295 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 05:10:46 np0005593295 systemd[1]: Starting nova_compute container...
Jan 23 05:10:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:46.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:46 np0005593295 systemd[1]: Started libcrun container.
Jan 23 05:10:46 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c2a27604130ae0c99245d8cc90e749391c91a55d1093354a2b3deb7500644d/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:46 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c2a27604130ae0c99245d8cc90e749391c91a55d1093354a2b3deb7500644d/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:46 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c2a27604130ae0c99245d8cc90e749391c91a55d1093354a2b3deb7500644d/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:46 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c2a27604130ae0c99245d8cc90e749391c91a55d1093354a2b3deb7500644d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:46 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c2a27604130ae0c99245d8cc90e749391c91a55d1093354a2b3deb7500644d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:46 np0005593295 podman[224614]: 2026-01-23 10:10:46.432064679 +0000 UTC m=+0.128288715 container init f99dfdec5ed1b13c4bc030b38f1e4e63000d9ea357b09a559200a7501b063f8e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=nova_compute)
Jan 23 05:10:46 np0005593295 podman[224614]: 2026-01-23 10:10:46.436955969 +0000 UTC m=+0.133179985 container start f99dfdec5ed1b13c4bc030b38f1e4e63000d9ea357b09a559200a7501b063f8e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm)
Jan 23 05:10:46 np0005593295 nova_compute[224630]: + sudo -E kolla_set_configs
Jan 23 05:10:46 np0005593295 podman[224614]: nova_compute
Jan 23 05:10:46 np0005593295 systemd[1]: Started nova_compute container.
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Validating config file
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Copying service configuration files
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Deleting /etc/ceph
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Creating directory /etc/ceph
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Setting permission for /etc/ceph
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Writing out command to execute
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 05:10:46 np0005593295 nova_compute[224630]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 05:10:46 np0005593295 nova_compute[224630]: ++ cat /run_command
Jan 23 05:10:46 np0005593295 nova_compute[224630]: + CMD=nova-compute
Jan 23 05:10:46 np0005593295 nova_compute[224630]: + ARGS=
Jan 23 05:10:46 np0005593295 nova_compute[224630]: + sudo kolla_copy_cacerts
Jan 23 05:10:46 np0005593295 nova_compute[224630]: + [[ ! -n '' ]]
Jan 23 05:10:46 np0005593295 nova_compute[224630]: + . kolla_extend_start
Jan 23 05:10:46 np0005593295 nova_compute[224630]: Running command: 'nova-compute'
Jan 23 05:10:46 np0005593295 nova_compute[224630]: + echo 'Running command: '\''nova-compute'\'''
Jan 23 05:10:46 np0005593295 nova_compute[224630]: + umask 0022
Jan 23 05:10:46 np0005593295 nova_compute[224630]: + exec nova-compute
Jan 23 05:10:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:10:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:47.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:10:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:47 np0005593295 python3.9[224792]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:10:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:48.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:48 np0005593295 python3.9[224944]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:10:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:49 np0005593295 nova_compute[224630]: 2026-01-23 10:10:49.051 224634 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 05:10:49 np0005593295 nova_compute[224630]: 2026-01-23 10:10:49.051 224634 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 05:10:49 np0005593295 nova_compute[224630]: 2026-01-23 10:10:49.052 224634 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 05:10:49 np0005593295 nova_compute[224630]: 2026-01-23 10:10:49.052 224634 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 23 05:10:49 np0005593295 nova_compute[224630]: 2026-01-23 10:10:49.193 224634 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:49 np0005593295 nova_compute[224630]: 2026-01-23 10:10:49.217 224634 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:49 np0005593295 nova_compute[224630]: 2026-01-23 10:10:49.217 224634 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 23 05:10:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:49.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:49 np0005593295 python3.9[225097]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 05:10:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.041 224634 INFO nova.virt.driver [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.175 224634 INFO nova.compute.provider_config [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.184 224634 DEBUG oslo_concurrency.lockutils [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.184 224634 DEBUG oslo_concurrency.lockutils [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.184 224634 DEBUG oslo_concurrency.lockutils [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.185 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.185 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.185 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.185 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.185 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.185 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.185 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.186 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.186 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.186 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.186 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.186 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.186 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.186 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.187 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.187 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.187 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.187 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.187 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.187 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.187 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.188 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.188 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.188 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.188 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.188 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.188 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.188 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.189 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.189 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.189 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.189 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.189 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.189 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.189 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.190 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.190 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.190 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.190 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.190 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.190 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.190 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.191 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.191 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.191 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.191 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.191 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.191 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.192 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.192 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.192 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.192 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.192 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.192 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.193 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.193 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.193 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.193 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.193 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.193 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.193 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.194 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.194 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.194 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.194 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.194 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.194 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.194 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.194 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.195 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.195 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.195 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.195 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.195 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.195 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.195 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.196 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.196 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.196 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.196 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.196 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.196 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.196 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.197 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.197 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.197 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.197 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.197 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.197 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.198 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.198 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.198 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.198 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.198 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.198 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.198 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.198 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.199 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.199 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.199 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.199 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.199 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.199 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.199 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.200 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.200 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.200 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.200 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.200 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.200 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.200 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.200 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.201 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.201 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.201 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.201 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.201 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.201 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.201 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.202 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.202 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.202 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.202 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.202 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.202 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.202 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.203 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.203 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.203 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.203 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.203 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.203 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.203 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.203 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.204 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.204 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.204 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.204 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.204 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.204 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.204 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.204 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.205 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.205 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.205 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.205 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.205 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.205 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.205 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.206 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.206 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.206 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.206 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.206 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.206 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.206 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.207 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.207 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.207 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.207 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.207 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.207 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.207 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.208 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.208 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.208 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.208 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.208 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.208 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.208 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.209 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.209 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.209 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.209 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.209 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.209 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.209 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.210 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.210 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.210 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.210 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.210 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.210 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.210 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.211 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.211 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.211 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.211 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.211 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.211 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.212 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.212 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.212 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.212 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.212 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.212 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.212 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.212 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.213 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.213 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.213 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.213 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.213 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.213 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.214 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.214 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.214 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.214 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.214 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.214 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.214 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.215 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.215 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.215 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.215 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.215 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.215 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.216 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.216 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.216 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.216 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.216 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.216 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.217 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.217 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.217 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.217 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.217 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.217 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.218 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.218 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.218 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.218 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.218 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.218 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.219 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.219 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.219 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.219 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.219 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.220 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.220 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.220 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.220 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.220 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.220 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.220 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.221 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.221 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.221 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.221 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.221 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.221 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.221 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.222 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.222 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.222 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.222 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.222 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.222 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.222 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.223 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.223 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.223 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.223 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.223 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.223 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.224 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.224 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.224 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.224 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.224 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.224 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.224 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.225 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.225 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.225 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.225 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.225 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.225 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.225 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.226 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.226 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.226 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.226 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.226 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.226 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.226 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.227 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.227 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.227 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.227 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.227 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.227 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.227 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.228 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.228 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.228 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.228 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.228 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.228 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.229 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.229 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.229 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.229 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.229 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.229 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.230 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.230 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.230 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.230 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.230 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.230 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.230 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.231 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.231 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.231 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.231 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.231 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.231 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.231 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.232 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.232 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.232 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.232 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.232 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.232 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.233 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.233 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.233 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.233 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.233 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.233 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.234 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.234 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.234 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.234 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.234 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.234 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.234 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.235 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.235 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.235 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.235 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.235 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.235 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.235 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.236 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.236 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.236 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.236 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.236 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.236 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.236 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.237 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.237 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.237 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.237 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.237 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.237 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.238 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.238 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.238 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.238 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.238 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.238 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.238 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.239 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.239 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.239 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.239 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.239 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.239 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.239 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.240 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.240 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.240 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.240 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.240 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.240 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.240 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.241 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.241 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.241 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.241 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.241 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.241 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.241 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.242 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.242 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.242 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.242 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.242 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.242 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.242 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.243 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.243 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.243 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.243 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.243 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.243 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.243 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.244 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.244 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.244 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.244 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.244 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.244 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.244 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.244 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.245 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.245 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.245 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.245 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.245 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.245 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.245 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.246 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.246 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.246 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.246 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.246 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.246 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.246 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.246 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.247 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.247 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.247 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.247 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.247 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.247 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.247 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.248 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.248 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.248 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.248 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.248 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.248 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.248 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.248 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.249 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.249 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.249 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.249 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.249 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.249 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.249 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.250 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.250 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.250 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.250 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.250 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.250 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.250 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.251 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.251 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.251 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.251 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.251 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.251 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.251 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.251 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.252 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.252 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.252 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.252 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.252 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.252 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.252 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.253 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.253 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.253 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.253 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.253 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.253 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.254 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.254 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.254 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.254 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.254 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.254 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.254 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.254 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.255 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.255 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.255 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.255 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.255 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.255 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.255 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.256 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.256 224634 WARNING oslo_config.cfg [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 23 05:10:50 np0005593295 nova_compute[224630]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 23 05:10:50 np0005593295 nova_compute[224630]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 23 05:10:50 np0005593295 nova_compute[224630]: and ``live_migration_inbound_addr`` respectively.
Jan 23 05:10:50 np0005593295 nova_compute[224630]: ).  Its value may be silently ignored in the future.#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.256 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.256 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.256 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.256 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.257 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.257 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.257 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.257 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.257 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.257 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.258 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.258 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.258 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.258 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.258 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.258 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.259 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.259 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.259 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.rbd_secret_uuid        = f3005f84-239a-55b6-a948-8f1fb592b920 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.259 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.259 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.259 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.260 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.260 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.260 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.260 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.260 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.260 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.261 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.261 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.261 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.261 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.261 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.261 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.262 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.262 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.262 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.262 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.262 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.262 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.262 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.263 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.263 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.263 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.263 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.263 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.263 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.263 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.264 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.264 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.264 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.264 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.264 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.264 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.264 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.265 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.265 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.265 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.265 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.265 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.266 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.266 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.266 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.266 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.266 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.266 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.266 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.266 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.267 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.267 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.267 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.267 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.267 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.267 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.268 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.268 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.268 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.268 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.268 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.268 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.268 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.268 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.269 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.269 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.269 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.269 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.269 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.269 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.269 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.270 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.270 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.270 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.270 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.270 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.270 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.270 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.271 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.271 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.271 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.271 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.271 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.271 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.271 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.272 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.272 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.272 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.272 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.272 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.272 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.272 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.273 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.273 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.273 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.273 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.273 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.273 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.273 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.274 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.274 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.274 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.274 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.274 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.274 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.274 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.274 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.275 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.275 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.275 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.275 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.275 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.275 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.275 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.276 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.276 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.276 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.276 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.276 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.276 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.276 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.277 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.277 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.277 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.277 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.277 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.277 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.278 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.278 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.278 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.278 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.278 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.278 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.278 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.279 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.279 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.279 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.279 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.279 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.279 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.279 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.280 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.280 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.280 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.280 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.280 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.280 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.281 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.281 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.281 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.281 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.281 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.281 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.281 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.282 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.282 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.282 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.282 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.282 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.282 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.282 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.283 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.283 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.283 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.283 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.283 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.283 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.284 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.284 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.284 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.284 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.284 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.284 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.284 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.285 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.285 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.285 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.285 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.285 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.285 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.286 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.286 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.286 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.286 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.286 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.286 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.286 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.287 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.287 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.287 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.287 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.287 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.287 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.287 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.287 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.288 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.288 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.288 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.288 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.288 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.288 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.288 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.289 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.289 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.289 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.289 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.289 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.289 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.289 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.290 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.290 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.290 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.290 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.290 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.291 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.291 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.291 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.291 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.291 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.292 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.292 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.292 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.292 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.292 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.292 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.292 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.293 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.293 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.293 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.293 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.293 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.294 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.294 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.294 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.294 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.294 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.294 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.295 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.295 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.295 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.295 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.295 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.295 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.295 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.296 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.296 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.296 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.296 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.296 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.296 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.296 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.297 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.297 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.297 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.297 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.297 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.297 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.297 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.298 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.298 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.298 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.298 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.298 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.298 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.298 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.299 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.299 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.299 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.299 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.299 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.299 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.299 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.300 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.300 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.300 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.300 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.300 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.300 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.300 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.301 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.301 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.301 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.301 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.301 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.301 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.302 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.302 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.302 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.302 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.303 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.303 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.303 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.303 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.303 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.304 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.304 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.304 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.304 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.304 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.304 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.304 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.304 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.305 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.305 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.305 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.305 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.305 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.305 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.305 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.306 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.306 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.306 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.306 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.306 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.306 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.307 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.307 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.307 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.307 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.307 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.307 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.307 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.308 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.308 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.308 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.308 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.308 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.308 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.308 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.309 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.309 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.309 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.309 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.309 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.309 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.309 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.309 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.310 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.310 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.310 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.310 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.310 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.310 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.310 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.311 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.311 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.311 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.311 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.311 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.311 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.311 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.312 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.312 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.312 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.312 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.312 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.312 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.312 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.313 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.313 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.313 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.313 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.313 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.313 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.313 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.314 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.314 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.314 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.314 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.314 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.314 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.314 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.315 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.315 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.315 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.315 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.315 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.315 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.316 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.316 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.316 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.316 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.316 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.316 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.316 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.316 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.317 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.317 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.317 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.317 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.317 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.317 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.317 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.318 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.318 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.318 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.318 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.318 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.318 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.318 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.319 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.319 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.319 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.319 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.319 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.319 224634 DEBUG oslo_service.service [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.320 224634 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.337 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.338 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.338 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.338 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 23 05:10:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:50.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:50 np0005593295 systemd[1]: Starting libvirt QEMU daemon...
Jan 23 05:10:50 np0005593295 systemd[1]: Started libvirt QEMU daemon.
Jan 23 05:10:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.415 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f2f79e52760> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.417 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f2f79e52760> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.418 224634 INFO nova.virt.libvirt.driver [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.435 224634 WARNING nova.virt.libvirt.driver [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Jan 23 05:10:50 np0005593295 nova_compute[224630]: 2026-01-23 10:10:50.435 224634 DEBUG nova.virt.libvirt.volume.mount [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 23 05:10:50 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 9.
Jan 23 05:10:50 np0005593295 python3.9[225301]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 23 05:10:50 np0005593295 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:10:50 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.722s CPU time.
Jan 23 05:10:50 np0005593295 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:10:50 np0005593295 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:10:50 np0005593295 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 05:10:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:51 np0005593295 podman[225400]: 2026-01-23 10:10:50.918656323 +0000 UTC m=+0.020642779 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.278 224634 INFO nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Libvirt host capabilities <capabilities>
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <host>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <uuid>84c28ede-4112-4d76-8f99-c7405a7d029c</uuid>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <cpu>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <arch>x86_64</arch>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model>EPYC-Rome-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <vendor>AMD</vendor>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <microcode version='16777317'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <signature family='23' model='49' stepping='0'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature name='x2apic'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature name='tsc-deadline'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature name='osxsave'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature name='hypervisor'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature name='tsc_adjust'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature name='spec-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature name='stibp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature name='arch-capabilities'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature name='ssbd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature name='cmp_legacy'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature name='topoext'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature name='virt-ssbd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature name='lbrv'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature name='tsc-scale'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature name='vmcb-clean'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature name='pause-filter'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature name='pfthreshold'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature name='svme-addr-chk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature name='rdctl-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature name='skip-l1dfl-vmentry'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature name='mds-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature name='pschange-mc-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <pages unit='KiB' size='4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <pages unit='KiB' size='2048'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <pages unit='KiB' size='1048576'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </cpu>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <power_management>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <suspend_mem/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </power_management>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <iommu support='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <migration_features>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <live/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <uri_transports>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <uri_transport>tcp</uri_transport>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <uri_transport>rdma</uri_transport>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </uri_transports>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </migration_features>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <topology>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <cells num='1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <cell id='0'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:          <memory unit='KiB'>7864316</memory>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:          <pages unit='KiB' size='4'>1966079</pages>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:          <pages unit='KiB' size='2048'>0</pages>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:          <distances>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:            <sibling id='0' value='10'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:          </distances>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:          <cpus num='8'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:          </cpus>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        </cell>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </cells>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </topology>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <cache>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </cache>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <secmodel>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model>selinux</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <doi>0</doi>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </secmodel>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <secmodel>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model>dac</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <doi>0</doi>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </secmodel>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </host>
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <guest>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <os_type>hvm</os_type>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <arch name='i686'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <wordsize>32</wordsize>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <domain type='qemu'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <domain type='kvm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </arch>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <features>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <pae/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <nonpae/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <acpi default='on' toggle='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <apic default='on' toggle='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <cpuselection/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <deviceboot/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <disksnapshot default='on' toggle='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <externalSnapshot/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </features>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </guest>
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <guest>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <os_type>hvm</os_type>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <arch name='x86_64'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <wordsize>64</wordsize>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <domain type='qemu'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <domain type='kvm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </arch>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <features>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <acpi default='on' toggle='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <apic default='on' toggle='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <cpuselection/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <deviceboot/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <disksnapshot default='on' toggle='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <externalSnapshot/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </features>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </guest>
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 
Jan 23 05:10:51 np0005593295 nova_compute[224630]: </capabilities>
Jan 23 05:10:51 np0005593295 nova_compute[224630]: #033[00m
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.285 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.302 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 23 05:10:51 np0005593295 nova_compute[224630]: <domainCapabilities>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <domain>kvm</domain>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <arch>i686</arch>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <vcpu max='240'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <iothreads supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <os supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <enum name='firmware'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <loader supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='type'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>rom</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>pflash</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='readonly'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>yes</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>no</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='secure'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>no</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </loader>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </os>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <cpu>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <mode name='host-passthrough' supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='hostPassthroughMigratable'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>on</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>off</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </mode>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <mode name='maximum' supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='maximumMigratable'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>on</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>off</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </mode>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <mode name='host-model' supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <vendor>AMD</vendor>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='x2apic'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='hypervisor'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='stibp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='ssbd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='overflow-recov'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='succor'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='lbrv'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='tsc-scale'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='flushbyasid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='pause-filter'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='pfthreshold'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='disable' name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </mode>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <mode name='custom' supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-noTSX'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='ClearwaterForest'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ddpd-u'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='intel-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='lam'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sha512'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sm3'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sm4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='ClearwaterForest-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ddpd-u'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='intel-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='lam'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sha512'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sm3'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sm4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cooperlake'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cooperlake-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cooperlake-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Denverton'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mpx'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Denverton-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mpx'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Denverton-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Denverton-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Dhyana-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Genoa'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='auto-ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='auto-ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='auto-ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='perfmon-v2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Milan'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Milan-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Milan-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Milan-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Rome'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Rome-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Rome-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Rome-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Turin'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='auto-ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='perfmon-v2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbpb'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Turin-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='auto-ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='perfmon-v2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbpb'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-v5'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='GraniteRapids'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='GraniteRapids-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='GraniteRapids-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-128'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-256'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-512'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='GraniteRapids-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-128'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-256'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-512'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-noTSX'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 05:10:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v5'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v6'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:51.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v7'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='IvyBridge'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='IvyBridge-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='IvyBridge-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='IvyBridge-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='KnightsMill'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512er'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512pf'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='KnightsMill-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512er'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512pf'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Opteron_G4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fma4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xop'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Opteron_G4-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fma4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xop'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Opteron_G5'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fma4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tbm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xop'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Opteron_G5-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fma4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tbm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xop'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SapphireRapids'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SapphireRapids-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SapphireRapids-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SapphireRapids-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SapphireRapids-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SierraForest'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SierraForest-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SierraForest-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='intel-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='lam'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SierraForest-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='intel-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='lam'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-v5'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Snowridge'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='core-capability'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mpx'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='split-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Snowridge-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='core-capability'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mpx'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='split-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Snowridge-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='core-capability'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='split-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Snowridge-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='core-capability'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='split-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Snowridge-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='athlon'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnow'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnowext'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='athlon-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnow'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnowext'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='core2duo'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='core2duo-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='coreduo'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='coreduo-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='n270'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='n270-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='phenom'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnow'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnowext'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='phenom-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnow'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnowext'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </mode>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </cpu>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <memoryBacking supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <enum name='sourceType'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <value>file</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <value>anonymous</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <value>memfd</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </memoryBacking>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <devices>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <disk supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='diskDevice'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>disk</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>cdrom</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>floppy</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>lun</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='bus'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>ide</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>fdc</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>scsi</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>usb</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>sata</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='model'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio-transitional</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio-non-transitional</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </disk>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <graphics supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='type'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vnc</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>egl-headless</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>dbus</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </graphics>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <video supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='modelType'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vga</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>cirrus</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>none</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>bochs</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>ramfb</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </video>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <hostdev supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='mode'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>subsystem</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='startupPolicy'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>default</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>mandatory</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>requisite</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>optional</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='subsysType'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>usb</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>pci</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>scsi</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='capsType'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='pciBackend'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </hostdev>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <rng supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='model'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio-transitional</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio-non-transitional</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='backendModel'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>random</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>egd</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>builtin</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </rng>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <filesystem supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='driverType'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>path</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>handle</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtiofs</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </filesystem>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <tpm supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='model'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>tpm-tis</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>tpm-crb</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='backendModel'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>emulator</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>external</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='backendVersion'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>2.0</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </tpm>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <redirdev supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='bus'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>usb</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </redirdev>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <channel supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='type'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>pty</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>unix</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </channel>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <crypto supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='model'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='type'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>qemu</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='backendModel'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>builtin</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </crypto>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <interface supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='backendType'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>default</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>passt</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </interface>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <panic supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='model'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>isa</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>hyperv</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </panic>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <console supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='type'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>null</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vc</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>pty</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>dev</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>file</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>pipe</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>stdio</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>udp</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>tcp</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>unix</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>qemu-vdagent</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>dbus</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </console>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </devices>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <features>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <gic supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <vmcoreinfo supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <genid supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <backingStoreInput supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <backup supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <async-teardown supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <s390-pv supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <ps2 supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <tdx supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <sev supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <sgx supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <hyperv supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='features'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>relaxed</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vapic</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>spinlocks</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vpindex</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>runtime</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>synic</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>stimer</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>reset</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vendor_id</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>frequencies</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>reenlightenment</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>tlbflush</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>ipi</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>avic</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>emsr_bitmap</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>xmm_input</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <defaults>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <spinlocks>4095</spinlocks>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <stimer_direct>on</stimer_direct>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </defaults>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </hyperv>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <launchSecurity supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </features>
Jan 23 05:10:51 np0005593295 nova_compute[224630]: </domainCapabilities>
Jan 23 05:10:51 np0005593295 nova_compute[224630]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.308 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 23 05:10:51 np0005593295 nova_compute[224630]: <domainCapabilities>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <domain>kvm</domain>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <arch>i686</arch>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <vcpu max='4096'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <iothreads supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <os supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <enum name='firmware'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <loader supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='type'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>rom</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>pflash</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='readonly'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>yes</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>no</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='secure'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>no</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </loader>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </os>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <cpu>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <mode name='host-passthrough' supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='hostPassthroughMigratable'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>on</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>off</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </mode>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <mode name='maximum' supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='maximumMigratable'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>on</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>off</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </mode>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <mode name='host-model' supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <vendor>AMD</vendor>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='x2apic'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='hypervisor'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='stibp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='ssbd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='overflow-recov'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='succor'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='lbrv'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='tsc-scale'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='flushbyasid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='pause-filter'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='pfthreshold'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='disable' name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </mode>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <mode name='custom' supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-noTSX'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='ClearwaterForest'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ddpd-u'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='intel-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='lam'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sha512'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sm3'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sm4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='ClearwaterForest-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ddpd-u'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='intel-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='lam'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sha512'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sm3'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sm4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cooperlake'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cooperlake-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cooperlake-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Denverton'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mpx'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Denverton-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mpx'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Denverton-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Denverton-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Dhyana-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Genoa'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='auto-ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='auto-ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='auto-ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='perfmon-v2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Milan'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Milan-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Milan-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Milan-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Rome'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Rome-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Rome-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Rome-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Turin'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='auto-ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='perfmon-v2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbpb'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Turin-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='auto-ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='perfmon-v2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbpb'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-v5'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='GraniteRapids'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='GraniteRapids-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='GraniteRapids-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-128'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-256'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-512'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='GraniteRapids-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-128'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-256'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-512'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-noTSX'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v5'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 05:10:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v6'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v7'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='IvyBridge'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='IvyBridge-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='IvyBridge-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='IvyBridge-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='KnightsMill'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512er'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512pf'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='KnightsMill-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512er'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512pf'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Opteron_G4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fma4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xop'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Opteron_G4-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fma4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xop'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Opteron_G5'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fma4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tbm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xop'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Opteron_G5-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fma4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tbm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xop'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SapphireRapids'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SapphireRapids-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SapphireRapids-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SapphireRapids-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SapphireRapids-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SierraForest'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SierraForest-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SierraForest-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='intel-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='lam'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SierraForest-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='intel-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='lam'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-v5'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Snowridge'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='core-capability'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mpx'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='split-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Snowridge-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='core-capability'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mpx'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='split-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Snowridge-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='core-capability'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='split-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Snowridge-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='core-capability'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='split-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Snowridge-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='athlon'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnow'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnowext'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='athlon-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnow'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnowext'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='core2duo'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='core2duo-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='coreduo'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='coreduo-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='n270'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='n270-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='phenom'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnow'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnowext'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='phenom-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnow'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnowext'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </mode>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </cpu>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <memoryBacking supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <enum name='sourceType'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <value>file</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <value>anonymous</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <value>memfd</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </memoryBacking>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <devices>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <disk supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='diskDevice'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>disk</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>cdrom</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>floppy</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>lun</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='bus'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>fdc</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>scsi</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>usb</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>sata</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='model'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio-transitional</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio-non-transitional</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </disk>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <graphics supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='type'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vnc</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>egl-headless</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>dbus</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </graphics>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <video supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='modelType'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vga</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>cirrus</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>none</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>bochs</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>ramfb</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </video>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <hostdev supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='mode'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>subsystem</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='startupPolicy'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>default</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>mandatory</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>requisite</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>optional</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='subsysType'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>usb</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>pci</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>scsi</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='capsType'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='pciBackend'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </hostdev>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <rng supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='model'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio-transitional</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio-non-transitional</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='backendModel'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>random</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>egd</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>builtin</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </rng>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <filesystem supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='driverType'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>path</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>handle</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtiofs</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </filesystem>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <tpm supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='model'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>tpm-tis</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>tpm-crb</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='backendModel'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>emulator</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>external</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='backendVersion'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>2.0</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </tpm>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <redirdev supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='bus'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>usb</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </redirdev>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <channel supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='type'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>pty</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>unix</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </channel>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <crypto supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='model'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='type'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>qemu</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='backendModel'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>builtin</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </crypto>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <interface supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='backendType'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>default</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>passt</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </interface>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <panic supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='model'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>isa</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>hyperv</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </panic>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <console supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='type'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>null</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vc</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>pty</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>dev</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>file</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>pipe</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>stdio</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>udp</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>tcp</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>unix</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>qemu-vdagent</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>dbus</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </console>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </devices>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <features>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <gic supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <vmcoreinfo supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <genid supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <backingStoreInput supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <backup supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <async-teardown supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <s390-pv supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <ps2 supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <tdx supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <sev supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <sgx supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <hyperv supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='features'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>relaxed</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vapic</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>spinlocks</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vpindex</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>runtime</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>synic</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>stimer</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>reset</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vendor_id</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>frequencies</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>reenlightenment</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>tlbflush</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>ipi</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>avic</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>emsr_bitmap</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>xmm_input</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <defaults>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <spinlocks>4095</spinlocks>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <stimer_direct>on</stimer_direct>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </defaults>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </hyperv>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <launchSecurity supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </features>
Jan 23 05:10:51 np0005593295 nova_compute[224630]: </domainCapabilities>
Jan 23 05:10:51 np0005593295 nova_compute[224630]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.358 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.362 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 23 05:10:51 np0005593295 nova_compute[224630]: <domainCapabilities>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <domain>kvm</domain>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <arch>x86_64</arch>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <vcpu max='240'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <iothreads supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <os supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <enum name='firmware'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <loader supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='type'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>rom</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>pflash</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='readonly'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>yes</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>no</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='secure'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>no</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </loader>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </os>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <cpu>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <mode name='host-passthrough' supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='hostPassthroughMigratable'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>on</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>off</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </mode>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <mode name='maximum' supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='maximumMigratable'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>on</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>off</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </mode>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <mode name='host-model' supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <vendor>AMD</vendor>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='x2apic'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='hypervisor'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='stibp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='ssbd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='overflow-recov'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='succor'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='lbrv'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='tsc-scale'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='flushbyasid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='pause-filter'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='pfthreshold'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='disable' name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </mode>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <mode name='custom' supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-noTSX'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='ClearwaterForest'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ddpd-u'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='intel-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='lam'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sha512'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sm3'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sm4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='ClearwaterForest-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ddpd-u'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='intel-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='lam'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sha512'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sm3'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sm4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cooperlake'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cooperlake-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cooperlake-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Denverton'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mpx'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Denverton-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mpx'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Denverton-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Denverton-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Dhyana-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Genoa'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='auto-ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='auto-ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='auto-ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='perfmon-v2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Milan'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Milan-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Milan-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Milan-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Rome'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Rome-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Rome-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Rome-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Turin'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='auto-ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='perfmon-v2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbpb'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Turin-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='auto-ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='perfmon-v2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbpb'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-v5'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='GraniteRapids'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='GraniteRapids-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='GraniteRapids-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-128'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-256'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-512'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='GraniteRapids-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-128'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-256'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-512'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-noTSX'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v5'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v6'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v7'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='IvyBridge'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='IvyBridge-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='IvyBridge-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='IvyBridge-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='KnightsMill'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512er'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512pf'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='KnightsMill-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512er'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512pf'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Opteron_G4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fma4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xop'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Opteron_G4-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fma4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xop'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Opteron_G5'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fma4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tbm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xop'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Opteron_G5-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fma4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tbm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xop'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SapphireRapids'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SapphireRapids-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SapphireRapids-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SapphireRapids-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SapphireRapids-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SierraForest'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SierraForest-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SierraForest-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='intel-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='lam'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SierraForest-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='intel-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='lam'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-v5'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Snowridge'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='core-capability'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mpx'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='split-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Snowridge-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='core-capability'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mpx'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='split-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Snowridge-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='core-capability'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='split-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Snowridge-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='core-capability'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='split-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Snowridge-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='athlon'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnow'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnowext'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='athlon-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnow'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnowext'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='core2duo'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='core2duo-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='coreduo'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='coreduo-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='n270'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='n270-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='phenom'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnow'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnowext'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='phenom-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnow'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnowext'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </mode>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </cpu>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <memoryBacking supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <enum name='sourceType'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <value>file</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <value>anonymous</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <value>memfd</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </memoryBacking>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <devices>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <disk supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='diskDevice'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>disk</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>cdrom</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>floppy</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>lun</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='bus'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>ide</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>fdc</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>scsi</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>usb</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>sata</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='model'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio-transitional</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio-non-transitional</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </disk>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <graphics supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='type'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vnc</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>egl-headless</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>dbus</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </graphics>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <video supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='modelType'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vga</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>cirrus</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>none</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>bochs</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>ramfb</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </video>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <hostdev supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='mode'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>subsystem</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='startupPolicy'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>default</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>mandatory</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>requisite</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>optional</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='subsysType'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>usb</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>pci</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>scsi</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='capsType'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='pciBackend'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </hostdev>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <rng supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='model'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio-transitional</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio-non-transitional</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='backendModel'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>random</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>egd</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>builtin</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </rng>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <filesystem supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='driverType'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>path</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>handle</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtiofs</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </filesystem>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <tpm supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='model'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>tpm-tis</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>tpm-crb</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='backendModel'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>emulator</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>external</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='backendVersion'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>2.0</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </tpm>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <redirdev supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='bus'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>usb</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </redirdev>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <channel supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='type'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>pty</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>unix</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </channel>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <crypto supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='model'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='type'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>qemu</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='backendModel'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>builtin</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </crypto>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <interface supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='backendType'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>default</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>passt</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </interface>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <panic supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='model'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>isa</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>hyperv</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </panic>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <console supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='type'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>null</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vc</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>pty</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>dev</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>file</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>pipe</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>stdio</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>udp</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>tcp</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>unix</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>qemu-vdagent</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>dbus</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </console>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </devices>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <features>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <gic supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <vmcoreinfo supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <genid supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <backingStoreInput supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <backup supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <async-teardown supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <s390-pv supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <ps2 supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <tdx supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <sev supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <sgx supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <hyperv supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='features'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>relaxed</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vapic</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>spinlocks</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vpindex</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>runtime</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>synic</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>stimer</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>reset</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vendor_id</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>frequencies</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>reenlightenment</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>tlbflush</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>ipi</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>avic</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>emsr_bitmap</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>xmm_input</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <defaults>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <spinlocks>4095</spinlocks>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <stimer_direct>on</stimer_direct>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </defaults>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </hyperv>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <launchSecurity supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </features>
Jan 23 05:10:51 np0005593295 nova_compute[224630]: </domainCapabilities>
Jan 23 05:10:51 np0005593295 nova_compute[224630]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.438 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 23 05:10:51 np0005593295 nova_compute[224630]: <domainCapabilities>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <domain>kvm</domain>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <arch>x86_64</arch>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <vcpu max='4096'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <iothreads supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <os supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <enum name='firmware'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <value>efi</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <loader supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='type'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>rom</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>pflash</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='readonly'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>yes</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>no</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='secure'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>yes</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>no</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </loader>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </os>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <cpu>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <mode name='host-passthrough' supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='hostPassthroughMigratable'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>on</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>off</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </mode>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <mode name='maximum' supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='maximumMigratable'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>on</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>off</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </mode>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <mode name='host-model' supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <vendor>AMD</vendor>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='x2apic'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='hypervisor'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='stibp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='ssbd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='overflow-recov'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='succor'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='lbrv'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='tsc-scale'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='flushbyasid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='pause-filter'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='pfthreshold'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <feature policy='disable' name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </mode>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <mode name='custom' supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-noTSX'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Broadwell-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='ClearwaterForest'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ddpd-u'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='intel-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='lam'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sha512'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sm3'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sm4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='ClearwaterForest-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ddpd-u'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='intel-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='lam'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sha512'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sm3'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sm4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cooperlake'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cooperlake-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Cooperlake-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Denverton'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mpx'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Denverton-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mpx'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Denverton-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Denverton-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Dhyana-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Genoa'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='auto-ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='auto-ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='auto-ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='perfmon-v2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Milan'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Milan-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Milan-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Milan-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Rome'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Rome-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Rome-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Rome-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Turin'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='auto-ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='perfmon-v2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbpb'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-Turin-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amd-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='auto-ibrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='perfmon-v2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbpb'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='stibp-always-on'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='EPYC-v5'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='GraniteRapids'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='GraniteRapids-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='GraniteRapids-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-128'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-256'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-512'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='GraniteRapids-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-128'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-256'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx10-512'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='prefetchiti'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-noTSX'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Haswell-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v5'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v6'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Icelake-Server-v7'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='IvyBridge'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='IvyBridge-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='IvyBridge-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='IvyBridge-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='KnightsMill'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512er'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512pf'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='KnightsMill-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512er'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512pf'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Opteron_G4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fma4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xop'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Opteron_G4-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fma4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xop'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Opteron_G5'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fma4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tbm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xop'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Opteron_G5-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fma4'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tbm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xop'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SapphireRapids'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SapphireRapids-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SapphireRapids-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SapphireRapids-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SapphireRapids-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='amx-tile'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-bf16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-fp16'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bitalg'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrc'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fzrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='la57'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='taa-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SierraForest'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SierraForest-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SierraForest-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='intel-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='lam'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='SierraForest-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ifma'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cmpccxadd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fbsdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='fsrs'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ibrs-all'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='intel-psfd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='lam'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mcdt-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pbrsb-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='psdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='serialize'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vaes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Client-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='hle'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='rtm'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Skylake-Server-v5'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512bw'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512cd'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512dq'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512f'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='avx512vl'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='invpcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pcid'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='pku'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Snowridge'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='core-capability'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mpx'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='split-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Snowridge-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='core-capability'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='mpx'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='split-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Snowridge-v2'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='core-capability'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='split-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Snowridge-v3'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='core-capability'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='split-lock-detect'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='Snowridge-v4'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='cldemote'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='erms'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='gfni'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdir64b'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='movdiri'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='xsaves'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='athlon'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnow'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnowext'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='athlon-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnow'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnowext'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='core2duo'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='core2duo-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='coreduo'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='coreduo-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='n270'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='n270-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='ss'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='phenom'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnow'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnowext'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <blockers model='phenom-v1'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnow'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <feature name='3dnowext'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </blockers>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </mode>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </cpu>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <memoryBacking supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <enum name='sourceType'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <value>file</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <value>anonymous</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <value>memfd</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </memoryBacking>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <devices>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <disk supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='diskDevice'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>disk</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>cdrom</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>floppy</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>lun</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='bus'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>fdc</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>scsi</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>usb</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>sata</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='model'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio-transitional</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio-non-transitional</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </disk>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <graphics supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='type'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vnc</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>egl-headless</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>dbus</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </graphics>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <video supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='modelType'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vga</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>cirrus</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>none</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>bochs</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>ramfb</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </video>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <hostdev supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='mode'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>subsystem</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='startupPolicy'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>default</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>mandatory</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>requisite</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>optional</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='subsysType'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>usb</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>pci</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>scsi</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='capsType'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='pciBackend'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </hostdev>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <rng supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='model'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio-transitional</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtio-non-transitional</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='backendModel'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>random</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>egd</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>builtin</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </rng>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <filesystem supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='driverType'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>path</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>handle</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>virtiofs</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </filesystem>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <tpm supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='model'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>tpm-tis</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>tpm-crb</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='backendModel'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>emulator</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>external</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='backendVersion'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>2.0</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </tpm>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <redirdev supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='bus'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>usb</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </redirdev>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <channel supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='type'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>pty</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>unix</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </channel>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <crypto supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='model'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='type'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>qemu</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='backendModel'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>builtin</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </crypto>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <interface supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='backendType'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>default</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>passt</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </interface>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <panic supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='model'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>isa</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>hyperv</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </panic>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <console supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='type'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>null</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vc</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>pty</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>dev</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>file</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>pipe</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>stdio</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>udp</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>tcp</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>unix</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>qemu-vdagent</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>dbus</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </console>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </devices>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  <features>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <gic supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <vmcoreinfo supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <genid supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <backingStoreInput supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <backup supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <async-teardown supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <s390-pv supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <ps2 supported='yes'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <tdx supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <sev supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <sgx supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <hyperv supported='yes'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <enum name='features'>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>relaxed</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vapic</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>spinlocks</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vpindex</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>runtime</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>synic</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>stimer</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>reset</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>vendor_id</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>frequencies</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>reenlightenment</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>tlbflush</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>ipi</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>avic</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>emsr_bitmap</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <value>xmm_input</value>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </enum>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      <defaults>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <spinlocks>4095</spinlocks>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <stimer_direct>on</stimer_direct>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:      </defaults>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    </hyperv>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:    <launchSecurity supported='no'/>
Jan 23 05:10:51 np0005593295 nova_compute[224630]:  </features>
Jan 23 05:10:51 np0005593295 nova_compute[224630]: </domainCapabilities>
Jan 23 05:10:51 np0005593295 nova_compute[224630]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.516 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.516 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.516 224634 DEBUG nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.521 224634 INFO nova.virt.libvirt.host [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Secure Boot support detected#033[00m
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.524 224634 INFO nova.virt.libvirt.driver [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.524 224634 INFO nova.virt.libvirt.driver [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.532 224634 DEBUG nova.virt.libvirt.driver [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.560 224634 INFO nova.virt.node [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Determined node identity db762d15-510c-4120-bfc4-afe76b90b657 from /var/lib/nova/compute_id#033[00m
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.578 224634 WARNING nova.compute.manager [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Compute nodes ['db762d15-510c-4120-bfc4-afe76b90b657'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.605 224634 INFO nova.compute.manager [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.637 224634 WARNING nova.compute.manager [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.638 224634 DEBUG oslo_concurrency.lockutils [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.638 224634 DEBUG oslo_concurrency.lockutils [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.638 224634 DEBUG oslo_concurrency.lockutils [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.638 224634 DEBUG nova.compute.resource_tracker [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:10:51 np0005593295 nova_compute[224630]: 2026-01-23 10:10:51.639 224634 DEBUG oslo_concurrency.processutils [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:51 np0005593295 python3.9[225550]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 05:10:51 np0005593295 systemd[1]: Stopping nova_compute container...
Jan 23 05:10:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:10:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:52.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:10:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:52 np0005593295 podman[225400]: 2026-01-23 10:10:52.64089142 +0000 UTC m=+1.742877846 container create 7bf0ac2a3b0db2b226ff7b02cceefaa8070d70f7dbf7b4bd95e30e956430f2e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Jan 23 05:10:52 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a7d889ca7155de289c02ef8a64720d2cb4293985fa9132c6bb9ef15a832b68d/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:52 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a7d889ca7155de289c02ef8a64720d2cb4293985fa9132c6bb9ef15a832b68d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:52 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a7d889ca7155de289c02ef8a64720d2cb4293985fa9132c6bb9ef15a832b68d/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:52 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a7d889ca7155de289c02ef8a64720d2cb4293985fa9132c6bb9ef15a832b68d/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:52 np0005593295 podman[225400]: 2026-01-23 10:10:52.937327288 +0000 UTC m=+2.039313734 container init 7bf0ac2a3b0db2b226ff7b02cceefaa8070d70f7dbf7b4bd95e30e956430f2e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 05:10:52 np0005593295 podman[225400]: 2026-01-23 10:10:52.945141159 +0000 UTC m=+2.047127585 container start 7bf0ac2a3b0db2b226ff7b02cceefaa8070d70f7dbf7b4bd95e30e956430f2e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Jan 23 05:10:52 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:10:52 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1198102383' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:10:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:10:52 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 05:10:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:10:52 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 05:10:53 np0005593295 nova_compute[224630]: 2026-01-23 10:10:53.008 224634 DEBUG oslo_concurrency.processutils [None req-eefcfb8c-2051-486b-86b0-8eb507b28774 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.369s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:53 np0005593295 systemd[1]: Starting libvirt nodedev daemon...
Jan 23 05:10:53 np0005593295 bash[225400]: 7bf0ac2a3b0db2b226ff7b02cceefaa8070d70f7dbf7b4bd95e30e956430f2e0
Jan 23 05:10:53 np0005593295 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:10:53 np0005593295 systemd[1]: Started libvirt nodedev daemon.
Jan 23 05:10:53 np0005593295 nova_compute[224630]: 2026-01-23 10:10:53.159 224634 DEBUG oslo_concurrency.lockutils [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:10:53 np0005593295 nova_compute[224630]: 2026-01-23 10:10:53.160 224634 DEBUG oslo_concurrency.lockutils [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:10:53 np0005593295 nova_compute[224630]: 2026-01-23 10:10:53.160 224634 DEBUG oslo_concurrency.lockutils [None req-103e6137-bf52-47ad-b0c1-31d508ba85c9 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:10:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:10:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:53.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:10:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:53 np0005593295 virtqemud[225221]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 23 05:10:53 np0005593295 virtqemud[225221]: hostname: compute-2
Jan 23 05:10:53 np0005593295 virtqemud[225221]: End of file while reading data: Input/output error
Jan 23 05:10:53 np0005593295 systemd[1]: libpod-f99dfdec5ed1b13c4bc030b38f1e4e63000d9ea357b09a559200a7501b063f8e.scope: Deactivated successfully.
Jan 23 05:10:53 np0005593295 systemd[1]: libpod-f99dfdec5ed1b13c4bc030b38f1e4e63000d9ea357b09a559200a7501b063f8e.scope: Consumed 3.626s CPU time.
Jan 23 05:10:53 np0005593295 podman[225565]: 2026-01-23 10:10:53.563129742 +0000 UTC m=+1.712954071 container died f99dfdec5ed1b13c4bc030b38f1e4e63000d9ea357b09a559200a7501b063f8e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 05:10:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:10:53 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 05:10:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:10:53 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 05:10:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:10:53 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 05:10:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:10:53 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 05:10:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:10:53 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 05:10:54 np0005593295 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f99dfdec5ed1b13c4bc030b38f1e4e63000d9ea357b09a559200a7501b063f8e-userdata-shm.mount: Deactivated successfully.
Jan 23 05:10:54 np0005593295 systemd[1]: var-lib-containers-storage-overlay-e3c2a27604130ae0c99245d8cc90e749391c91a55d1093354a2b3deb7500644d-merged.mount: Deactivated successfully.
Jan 23 05:10:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:10:54 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:10:54 np0005593295 podman[225565]: 2026-01-23 10:10:54.140308421 +0000 UTC m=+2.290132690 container cleanup f99dfdec5ed1b13c4bc030b38f1e4e63000d9ea357b09a559200a7501b063f8e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=nova_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm)
Jan 23 05:10:54 np0005593295 podman[225565]: nova_compute
Jan 23 05:10:54 np0005593295 podman[225673]: nova_compute
Jan 23 05:10:54 np0005593295 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 23 05:10:54 np0005593295 systemd[1]: Stopped nova_compute container.
Jan 23 05:10:54 np0005593295 systemd[1]: Starting nova_compute container...
Jan 23 05:10:54 np0005593295 systemd[1]: Started libcrun container.
Jan 23 05:10:54 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c2a27604130ae0c99245d8cc90e749391c91a55d1093354a2b3deb7500644d/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:54 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c2a27604130ae0c99245d8cc90e749391c91a55d1093354a2b3deb7500644d/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:54 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c2a27604130ae0c99245d8cc90e749391c91a55d1093354a2b3deb7500644d/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:54 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c2a27604130ae0c99245d8cc90e749391c91a55d1093354a2b3deb7500644d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:54 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c2a27604130ae0c99245d8cc90e749391c91a55d1093354a2b3deb7500644d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:54 np0005593295 podman[225686]: 2026-01-23 10:10:54.320313906 +0000 UTC m=+0.096121034 container init f99dfdec5ed1b13c4bc030b38f1e4e63000d9ea357b09a559200a7501b063f8e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:10:54 np0005593295 podman[225686]: 2026-01-23 10:10:54.331249634 +0000 UTC m=+0.107056752 container start f99dfdec5ed1b13c4bc030b38f1e4e63000d9ea357b09a559200a7501b063f8e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Jan 23 05:10:54 np0005593295 podman[225686]: nova_compute
Jan 23 05:10:54 np0005593295 nova_compute[225701]: + sudo -E kolla_set_configs
Jan 23 05:10:54 np0005593295 systemd[1]: Started nova_compute container.
Jan 23 05:10:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:54.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Validating config file
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Copying service configuration files
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Deleting /etc/ceph
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Creating directory /etc/ceph
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Setting permission for /etc/ceph
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Writing out command to execute
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 05:10:54 np0005593295 nova_compute[225701]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 05:10:54 np0005593295 nova_compute[225701]: ++ cat /run_command
Jan 23 05:10:54 np0005593295 nova_compute[225701]: + CMD=nova-compute
Jan 23 05:10:54 np0005593295 nova_compute[225701]: + ARGS=
Jan 23 05:10:54 np0005593295 nova_compute[225701]: + sudo kolla_copy_cacerts
Jan 23 05:10:54 np0005593295 nova_compute[225701]: + [[ ! -n '' ]]
Jan 23 05:10:54 np0005593295 nova_compute[225701]: + . kolla_extend_start
Jan 23 05:10:54 np0005593295 nova_compute[225701]: Running command: 'nova-compute'
Jan 23 05:10:54 np0005593295 nova_compute[225701]: + echo 'Running command: '\''nova-compute'\'''
Jan 23 05:10:54 np0005593295 nova_compute[225701]: + umask 0022
Jan 23 05:10:54 np0005593295 nova_compute[225701]: + exec nova-compute
Jan 23 05:10:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:55 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:10:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:55.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:10:55 np0005593295 python3.9[225865]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 23 05:10:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:10:55.472 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:10:55.473 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:10:55.474 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:55 np0005593295 systemd[1]: Started libpod-conmon-ab35065d5c90da8f59fd3b2ff3626e82eb030c0e98065938576df5f518cd313c.scope.
Jan 23 05:10:55 np0005593295 systemd[1]: Started libcrun container.
Jan 23 05:10:55 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f22e8840e86e25e717c359cb474b35854cdc3e93e9623e9d87c066db60f0155/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:55 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f22e8840e86e25e717c359cb474b35854cdc3e93e9623e9d87c066db60f0155/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:55 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f22e8840e86e25e717c359cb474b35854cdc3e93e9623e9d87c066db60f0155/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 23 05:10:55 np0005593295 podman[225889]: 2026-01-23 10:10:55.560347709 +0000 UTC m=+0.123913446 container init ab35065d5c90da8f59fd3b2ff3626e82eb030c0e98065938576df5f518cd313c (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 05:10:55 np0005593295 podman[225889]: 2026-01-23 10:10:55.56766507 +0000 UTC m=+0.131230787 container start ab35065d5c90da8f59fd3b2ff3626e82eb030c0e98065938576df5f518cd313c (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:10:55 np0005593295 python3.9[225865]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 23 05:10:55 np0005593295 nova_compute_init[225911]: INFO:nova_statedir:Applying nova statedir ownership
Jan 23 05:10:55 np0005593295 nova_compute_init[225911]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 23 05:10:55 np0005593295 nova_compute_init[225911]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 23 05:10:55 np0005593295 nova_compute_init[225911]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 23 05:10:55 np0005593295 nova_compute_init[225911]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 23 05:10:55 np0005593295 nova_compute_init[225911]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 23 05:10:55 np0005593295 nova_compute_init[225911]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 23 05:10:55 np0005593295 nova_compute_init[225911]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 23 05:10:55 np0005593295 nova_compute_init[225911]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 23 05:10:55 np0005593295 nova_compute_init[225911]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 23 05:10:55 np0005593295 nova_compute_init[225911]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 23 05:10:55 np0005593295 nova_compute_init[225911]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 23 05:10:55 np0005593295 nova_compute_init[225911]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 23 05:10:55 np0005593295 nova_compute_init[225911]: INFO:nova_statedir:Nova statedir ownership complete
Jan 23 05:10:55 np0005593295 systemd[1]: libpod-ab35065d5c90da8f59fd3b2ff3626e82eb030c0e98065938576df5f518cd313c.scope: Deactivated successfully.
Jan 23 05:10:55 np0005593295 podman[225912]: 2026-01-23 10:10:55.624311952 +0000 UTC m=+0.026961454 container died ab35065d5c90da8f59fd3b2ff3626e82eb030c0e98065938576df5f518cd313c (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Jan 23 05:10:55 np0005593295 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ab35065d5c90da8f59fd3b2ff3626e82eb030c0e98065938576df5f518cd313c-userdata-shm.mount: Deactivated successfully.
Jan 23 05:10:55 np0005593295 systemd[1]: var-lib-containers-storage-overlay-3f22e8840e86e25e717c359cb474b35854cdc3e93e9623e9d87c066db60f0155-merged.mount: Deactivated successfully.
Jan 23 05:10:55 np0005593295 podman[225923]: 2026-01-23 10:10:55.680435472 +0000 UTC m=+0.047145921 container cleanup ab35065d5c90da8f59fd3b2ff3626e82eb030c0e98065938576df5f518cd313c (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 23 05:10:55 np0005593295 systemd[1]: libpod-conmon-ab35065d5c90da8f59fd3b2ff3626e82eb030c0e98065938576df5f518cd313c.scope: Deactivated successfully.
Jan 23 05:10:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:10:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:56.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:10:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:56 np0005593295 nova_compute[225701]: 2026-01-23 10:10:56.617 225706 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 05:10:56 np0005593295 nova_compute[225701]: 2026-01-23 10:10:56.618 225706 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 05:10:56 np0005593295 nova_compute[225701]: 2026-01-23 10:10:56.618 225706 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 05:10:56 np0005593295 nova_compute[225701]: 2026-01-23 10:10:56.618 225706 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 23 05:10:56 np0005593295 systemd[1]: session-53.scope: Deactivated successfully.
Jan 23 05:10:56 np0005593295 systemd[1]: session-53.scope: Consumed 1min 57.570s CPU time.
Jan 23 05:10:56 np0005593295 systemd-logind[786]: Session 53 logged out. Waiting for processes to exit.
Jan 23 05:10:56 np0005593295 systemd-logind[786]: Removed session 53.
Jan 23 05:10:56 np0005593295 nova_compute[225701]: 2026-01-23 10:10:56.750 225706 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:56 np0005593295 nova_compute[225701]: 2026-01-23 10:10:56.773 225706 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:56 np0005593295 nova_compute[225701]: 2026-01-23 10:10:56.773 225706 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 23 05:10:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.204 225706 INFO nova.virt.driver [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.304 225706 INFO nova.compute.provider_config [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.313 225706 DEBUG oslo_concurrency.lockutils [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.313 225706 DEBUG oslo_concurrency.lockutils [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.313 225706 DEBUG oslo_concurrency.lockutils [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.314 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.314 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.314 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.314 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.314 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.314 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.315 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.315 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.315 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.315 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.315 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.315 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.315 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.316 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.316 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.316 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.316 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.316 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.316 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.316 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.317 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.317 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.317 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.317 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.317 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.317 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.317 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.318 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.318 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.318 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.318 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.318 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.318 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.318 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.319 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.319 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.319 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.319 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.319 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.319 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.319 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.320 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.320 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.320 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.320 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.320 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.321 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.321 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.321 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.321 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.321 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.322 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.322 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.322 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.322 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.322 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.322 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.323 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.323 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.323 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.323 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.323 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.323 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.323 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.323 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.324 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.324 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.324 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.324 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.324 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.324 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.324 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.325 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.325 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.325 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.325 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.325 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.325 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.325 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.326 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.326 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.326 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.326 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.326 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.326 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.326 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.327 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.327 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.327 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.327 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.327 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.327 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.327 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.328 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.328 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.328 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.328 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.328 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.328 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.328 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.328 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.329 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.329 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.329 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.329 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.329 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.329 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.329 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.330 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.330 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.330 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.330 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.330 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.330 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.330 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.331 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.331 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.331 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.331 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.331 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.331 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.331 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.332 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.332 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.332 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.332 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.332 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:57.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.332 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.332 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.333 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.333 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.333 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.333 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.333 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.333 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.334 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.334 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.334 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.334 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.334 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.334 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.335 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.335 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.335 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.335 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.335 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.335 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.336 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.336 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.336 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.336 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.336 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.336 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.337 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.337 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.337 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.337 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.337 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.337 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.338 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.338 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.338 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.338 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.338 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.338 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.338 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.339 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.339 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.339 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.339 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.339 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.339 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.340 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.340 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.340 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.340 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.340 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.340 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.340 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.341 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.341 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.341 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.341 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.341 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.341 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.341 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.342 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.342 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.342 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.342 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.342 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.343 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.343 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.343 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.343 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.343 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.343 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.344 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.344 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.344 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.344 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.344 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.344 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.344 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.345 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.345 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.345 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.345 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.345 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.345 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.345 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.345 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.346 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.346 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.346 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.346 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.346 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.346 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.347 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.347 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.347 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.347 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.347 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.347 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.347 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.348 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.348 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.348 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.348 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.348 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.348 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.348 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.349 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.349 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.349 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.349 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.349 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.349 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.350 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.350 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.350 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.350 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.350 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.350 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.350 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.351 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.351 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.351 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.351 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.351 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.351 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.351 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.352 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.352 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.352 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.352 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.352 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.352 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.352 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.352 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.353 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.353 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.353 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.353 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.353 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.353 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.353 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.354 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.354 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.354 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.354 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.354 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.354 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.354 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.355 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.355 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.355 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.355 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.355 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.355 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.355 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.356 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.356 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.356 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.356 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.356 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.356 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.356 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.356 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.357 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.357 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.357 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.357 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.357 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.357 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.357 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.358 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.358 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.358 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.358 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.358 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.358 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.358 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.359 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.359 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.359 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.359 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.359 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.359 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.359 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.360 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.360 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.360 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.360 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.360 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.360 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.360 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.361 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.361 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.361 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.361 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.361 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.361 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.361 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.361 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.362 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.362 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.362 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.362 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.362 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.363 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.363 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.363 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.363 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.363 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.364 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.364 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.364 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.364 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.364 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.365 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.365 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.365 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.365 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.365 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.366 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.366 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.366 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.366 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.367 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.367 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.367 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.367 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.367 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.368 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.368 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.368 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.368 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.368 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.369 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.369 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.369 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.369 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.369 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.369 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.369 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.370 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.370 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.370 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.370 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.370 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.370 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.370 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.371 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.371 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.371 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.371 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.371 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.371 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.371 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.372 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.372 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.372 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.372 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.372 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.372 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.372 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.373 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.373 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.373 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.373 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.373 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.373 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.373 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.374 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.374 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.374 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.374 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.374 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.374 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.375 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.375 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.375 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.375 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.375 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.375 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.376 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.376 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.376 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.376 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.376 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.377 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.377 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.377 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.377 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.377 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.378 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.378 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.378 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.378 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.378 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.379 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.379 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.379 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.379 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.380 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.380 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.380 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.380 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.380 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.381 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.381 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.381 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.381 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.381 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.382 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.382 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.382 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.382 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.382 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.383 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.383 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.383 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.383 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.383 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.384 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.384 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.384 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.384 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.384 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.385 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.385 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.385 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.385 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.385 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.386 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.386 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.386 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.386 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.386 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.387 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.387 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.387 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.387 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.387 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.388 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.388 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.388 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.388 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.389 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.389 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.389 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.389 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.389 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.390 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.390 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.390 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.390 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.390 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.391 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.391 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.391 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.391 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.391 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.392 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.392 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.392 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.392 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.393 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.393 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.393 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.393 225706 WARNING oslo_config.cfg [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 23 05:10:57 np0005593295 nova_compute[225701]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 23 05:10:57 np0005593295 nova_compute[225701]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 23 05:10:57 np0005593295 nova_compute[225701]: and ``live_migration_inbound_addr`` respectively.
Jan 23 05:10:57 np0005593295 nova_compute[225701]: ).  Its value may be silently ignored in the future.#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.394 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.394 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.394 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.394 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.394 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.395 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.395 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.395 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.395 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.396 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.396 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.396 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.396 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.396 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.397 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.397 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.397 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.397 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.397 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.rbd_secret_uuid        = f3005f84-239a-55b6-a948-8f1fb592b920 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.398 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.398 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.398 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.398 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.399 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.399 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.399 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.399 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.399 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.400 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.400 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.400 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.400 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.401 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.401 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.401 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.401 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.401 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.402 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.402 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.402 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.402 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.403 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.403 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.403 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.403 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.403 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.403 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.404 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.404 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.404 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.404 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.404 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.405 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.405 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.405 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.405 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.406 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.406 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.406 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.406 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.406 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.407 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.407 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.407 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.407 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.407 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.408 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.408 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.408 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.408 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.408 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.409 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.409 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.409 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.409 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.410 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.410 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.410 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.410 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.410 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.411 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.411 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.411 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.411 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.411 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.412 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.412 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.412 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.412 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.412 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.413 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.413 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.413 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.413 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.414 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.414 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.414 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.414 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.414 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.415 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.415 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.415 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.415 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.415 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.416 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.416 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.416 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.416 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.416 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.417 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.417 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.417 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.417 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.418 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.418 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.418 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.418 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.418 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.419 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.419 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.419 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.419 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.419 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.420 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.420 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.420 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.420 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.420 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.421 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.421 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.421 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.421 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.422 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.422 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.422 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.422 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.422 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.423 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.423 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.423 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.424 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.424 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.424 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.424 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.424 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.425 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.425 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.425 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.425 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.425 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.426 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.426 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.426 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.426 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.427 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.427 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.427 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.427 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.428 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.428 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.428 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.428 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.428 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.429 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.429 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.429 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.429 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.429 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.430 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.430 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.430 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.430 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.430 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.431 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.431 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.431 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.431 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.431 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.432 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.432 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.432 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.432 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.433 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.433 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.433 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.433 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.433 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.434 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.434 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.434 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.434 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.434 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.435 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.435 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.435 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.435 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.435 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.436 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.436 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.436 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.436 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.437 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.437 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.437 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.437 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.437 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.438 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.438 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.438 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.438 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.438 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.439 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.439 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.439 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.439 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.439 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.440 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.440 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.440 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.440 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.440 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.441 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.441 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.441 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.441 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.441 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.442 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.442 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.442 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.442 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.443 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.443 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.443 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.443 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.444 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.444 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.444 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.444 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.444 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.445 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.445 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.445 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.445 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.445 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.446 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.446 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.446 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.446 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.447 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.447 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.447 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.448 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.448 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.448 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.448 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.448 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.449 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.449 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.449 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.449 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.449 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.450 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.450 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.450 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.450 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.451 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.451 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.451 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.451 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.451 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.452 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.452 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.452 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.452 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.453 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.453 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.453 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.453 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.453 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.454 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.454 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.454 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.454 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.454 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.455 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.455 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.455 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.455 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.456 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.456 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.456 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.456 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.456 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.457 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.457 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.457 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.457 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.458 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.458 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.458 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.458 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.458 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.459 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.459 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.459 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.459 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.460 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.460 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.460 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.460 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.460 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.460 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.461 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.461 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.461 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.461 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.461 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.462 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.462 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.462 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.462 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.463 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.463 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.463 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.463 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.463 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.464 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.464 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.464 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.464 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.465 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.465 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.465 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.466 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.466 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.466 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.466 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.467 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.467 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.467 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.467 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.467 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.468 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.468 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.468 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.468 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.468 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.469 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.469 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.469 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.469 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.469 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.470 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.470 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.470 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.470 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.470 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.471 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.471 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.471 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.471 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.471 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.472 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.472 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.472 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.472 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.472 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.473 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.473 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.473 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.473 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.473 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.474 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.474 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.474 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.474 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.475 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.475 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.475 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.475 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.475 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.476 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.476 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.476 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.476 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.477 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.477 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.477 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.477 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.477 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.478 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.478 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.478 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.478 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.479 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.479 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.479 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.479 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.479 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.480 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.480 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.480 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.480 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.481 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.481 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.481 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.481 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.481 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.482 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.482 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.482 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.482 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.482 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.483 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.495 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.496 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.496 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.496 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.496 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.497 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.497 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.497 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.497 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.497 225706 DEBUG oslo_service.service [None req-f997fefa-9e8a-4827-902b-953b7633da04 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.499 225706 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.515 225706 INFO nova.virt.node [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Determined node identity db762d15-510c-4120-bfc4-afe76b90b657 from /var/lib/nova/compute_id#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.516 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.516 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.517 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.517 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.534 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f563b9cf5b0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.537 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f563b9cf5b0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.538 225706 INFO nova.virt.libvirt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.546 225706 INFO nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Libvirt host capabilities <capabilities>
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <host>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <uuid>84c28ede-4112-4d76-8f99-c7405a7d029c</uuid>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <cpu>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <arch>x86_64</arch>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model>EPYC-Rome-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <vendor>AMD</vendor>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <microcode version='16777317'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <signature family='23' model='49' stepping='0'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature name='x2apic'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature name='tsc-deadline'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature name='osxsave'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature name='hypervisor'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature name='tsc_adjust'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature name='spec-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature name='stibp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature name='arch-capabilities'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature name='ssbd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature name='cmp_legacy'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature name='topoext'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature name='virt-ssbd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature name='lbrv'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature name='tsc-scale'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature name='vmcb-clean'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature name='pause-filter'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature name='pfthreshold'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature name='svme-addr-chk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature name='rdctl-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature name='skip-l1dfl-vmentry'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature name='mds-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature name='pschange-mc-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <pages unit='KiB' size='4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <pages unit='KiB' size='2048'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <pages unit='KiB' size='1048576'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </cpu>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <power_management>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <suspend_mem/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </power_management>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <iommu support='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <migration_features>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <live/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <uri_transports>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <uri_transport>tcp</uri_transport>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <uri_transport>rdma</uri_transport>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </uri_transports>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </migration_features>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <topology>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <cells num='1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <cell id='0'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:          <memory unit='KiB'>7864316</memory>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:          <pages unit='KiB' size='4'>1966079</pages>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:          <pages unit='KiB' size='2048'>0</pages>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:          <distances>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:            <sibling id='0' value='10'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:          </distances>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:          <cpus num='8'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:          </cpus>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        </cell>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </cells>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </topology>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <cache>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </cache>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <secmodel>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model>selinux</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <doi>0</doi>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </secmodel>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <secmodel>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model>dac</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <doi>0</doi>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </secmodel>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </host>
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <guest>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <os_type>hvm</os_type>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <arch name='i686'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <wordsize>32</wordsize>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <domain type='qemu'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <domain type='kvm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </arch>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <features>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <pae/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <nonpae/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <acpi default='on' toggle='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <apic default='on' toggle='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <cpuselection/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <deviceboot/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <disksnapshot default='on' toggle='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <externalSnapshot/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </features>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </guest>
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <guest>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <os_type>hvm</os_type>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <arch name='x86_64'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <wordsize>64</wordsize>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <domain type='qemu'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <domain type='kvm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </arch>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <features>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <acpi default='on' toggle='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <apic default='on' toggle='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <cpuselection/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <deviceboot/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <disksnapshot default='on' toggle='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <externalSnapshot/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </features>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </guest>
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 
Jan 23 05:10:57 np0005593295 nova_compute[225701]: </capabilities>
Jan 23 05:10:57 np0005593295 nova_compute[225701]: #033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.554 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.561 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 23 05:10:57 np0005593295 nova_compute[225701]: <domainCapabilities>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <domain>kvm</domain>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <arch>i686</arch>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <vcpu max='4096'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <iothreads supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <os supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <enum name='firmware'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <loader supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='type'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>rom</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>pflash</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='readonly'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>yes</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>no</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='secure'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>no</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </loader>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </os>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <cpu>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <mode name='host-passthrough' supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='hostPassthroughMigratable'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>on</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>off</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </mode>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <mode name='maximum' supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='maximumMigratable'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>on</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>off</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </mode>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <mode name='host-model' supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <vendor>AMD</vendor>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='x2apic'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='hypervisor'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='stibp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='ssbd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='overflow-recov'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='succor'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='lbrv'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='tsc-scale'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='flushbyasid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='pause-filter'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='pfthreshold'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='disable' name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </mode>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <mode name='custom' supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-noTSX'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='ClearwaterForest'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ddpd-u'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sha512'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sm3'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sm4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='ClearwaterForest-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ddpd-u'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sha512'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sm3'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sm4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cooperlake'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cooperlake-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cooperlake-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Denverton'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Denverton-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Denverton-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Denverton-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Dhyana-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Genoa'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Milan'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Milan-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Milan-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Milan-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Rome'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Rome-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Rome-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Rome-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Turin'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbpb'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Turin-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbpb'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-v5'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='GraniteRapids'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='GraniteRapids-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:10:57 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:10:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[225589]: 23/01/2026 10:10:57 : epoch 6973492c : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='GraniteRapids-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-128'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-256'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-512'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='GraniteRapids-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-128'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-256'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-512'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-noTSX'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v5'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v6'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v7'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='IvyBridge'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='IvyBridge-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='IvyBridge-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='IvyBridge-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='KnightsMill'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512er'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512pf'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='KnightsMill-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512er'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512pf'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Opteron_G4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Opteron_G4-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Opteron_G5'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tbm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Opteron_G5-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tbm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SapphireRapids'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SapphireRapids-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SapphireRapids-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SapphireRapids-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SapphireRapids-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SierraForest'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SierraForest-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SierraForest-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SierraForest-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-v5'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Snowridge'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Snowridge-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Snowridge-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Snowridge-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Snowridge-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='athlon'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='athlon-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='core2duo'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='core2duo-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='coreduo'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='coreduo-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='n270'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='n270-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='phenom'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='phenom-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </mode>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </cpu>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <memoryBacking supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <enum name='sourceType'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <value>file</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <value>anonymous</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <value>memfd</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </memoryBacking>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <devices>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <disk supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='diskDevice'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>disk</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>cdrom</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>floppy</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>lun</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='bus'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>fdc</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>scsi</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>usb</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>sata</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='model'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio-transitional</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio-non-transitional</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </disk>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <graphics supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='type'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vnc</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>egl-headless</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>dbus</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </graphics>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <video supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='modelType'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vga</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>cirrus</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>none</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>bochs</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>ramfb</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </video>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <hostdev supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='mode'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>subsystem</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='startupPolicy'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>default</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>mandatory</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>requisite</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>optional</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='subsysType'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>usb</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>pci</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>scsi</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='capsType'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='pciBackend'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </hostdev>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <rng supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='model'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio-transitional</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio-non-transitional</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>random</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>egd</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>builtin</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </rng>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <filesystem supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='driverType'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>path</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>handle</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtiofs</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </filesystem>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <tpm supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='model'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>tpm-tis</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>tpm-crb</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>emulator</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>external</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='backendVersion'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>2.0</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </tpm>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <redirdev supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='bus'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>usb</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </redirdev>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <channel supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='type'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>pty</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>unix</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </channel>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <crypto supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='model'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='type'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>qemu</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>builtin</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </crypto>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <interface supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='backendType'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>default</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>passt</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </interface>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <panic supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='model'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>isa</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>hyperv</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </panic>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <console supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='type'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>null</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vc</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>pty</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>dev</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>file</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>pipe</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>stdio</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>udp</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>tcp</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>unix</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>qemu-vdagent</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>dbus</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </console>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </devices>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <features>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <gic supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <vmcoreinfo supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <genid supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <backingStoreInput supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <backup supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <async-teardown supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <s390-pv supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <ps2 supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <tdx supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <sev supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <sgx supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <hyperv supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='features'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>relaxed</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vapic</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>spinlocks</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vpindex</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>runtime</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>synic</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>stimer</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>reset</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vendor_id</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>frequencies</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>reenlightenment</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>tlbflush</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>ipi</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>avic</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>emsr_bitmap</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>xmm_input</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <defaults>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <spinlocks>4095</spinlocks>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <stimer_direct>on</stimer_direct>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </defaults>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </hyperv>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <launchSecurity supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </features>
Jan 23 05:10:57 np0005593295 nova_compute[225701]: </domainCapabilities>
Jan 23 05:10:57 np0005593295 nova_compute[225701]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.571 225706 DEBUG nova.virt.libvirt.volume.mount [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.576 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 23 05:10:57 np0005593295 nova_compute[225701]: <domainCapabilities>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <domain>kvm</domain>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <arch>i686</arch>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <vcpu max='240'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <iothreads supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <os supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <enum name='firmware'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <loader supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='type'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>rom</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>pflash</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='readonly'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>yes</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>no</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='secure'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>no</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </loader>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </os>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <cpu>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <mode name='host-passthrough' supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='hostPassthroughMigratable'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>on</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>off</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </mode>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <mode name='maximum' supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='maximumMigratable'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>on</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>off</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </mode>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <mode name='host-model' supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <vendor>AMD</vendor>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='x2apic'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='hypervisor'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='stibp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='ssbd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='overflow-recov'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='succor'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='lbrv'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='tsc-scale'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='flushbyasid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='pause-filter'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='pfthreshold'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='disable' name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </mode>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <mode name='custom' supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-noTSX'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='ClearwaterForest'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ddpd-u'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sha512'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sm3'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sm4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='ClearwaterForest-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ddpd-u'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sha512'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sm3'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sm4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cooperlake'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cooperlake-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cooperlake-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Denverton'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Denverton-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Denverton-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Denverton-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Dhyana-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Genoa'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Milan'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Milan-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Milan-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Milan-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Rome'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Rome-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Rome-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Rome-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Turin'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbpb'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Turin-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbpb'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-v5'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='GraniteRapids'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='GraniteRapids-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='GraniteRapids-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-128'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-256'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-512'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='GraniteRapids-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-128'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-256'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-512'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-noTSX'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v5'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v6'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v7'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='IvyBridge'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='IvyBridge-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='IvyBridge-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='IvyBridge-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='KnightsMill'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512er'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512pf'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='KnightsMill-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512er'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512pf'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Opteron_G4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Opteron_G4-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Opteron_G5'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tbm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Opteron_G5-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tbm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SapphireRapids'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SapphireRapids-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SapphireRapids-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SapphireRapids-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SapphireRapids-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SierraForest'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SierraForest-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SierraForest-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SierraForest-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-v5'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Snowridge'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Snowridge-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Snowridge-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Snowridge-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Snowridge-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='athlon'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='athlon-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='core2duo'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='core2duo-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='coreduo'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='coreduo-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='n270'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='n270-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='phenom'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='phenom-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </mode>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </cpu>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <memoryBacking supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <enum name='sourceType'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <value>file</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <value>anonymous</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <value>memfd</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </memoryBacking>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <devices>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <disk supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='diskDevice'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>disk</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>cdrom</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>floppy</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>lun</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='bus'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>ide</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>fdc</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>scsi</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>usb</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>sata</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='model'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio-transitional</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio-non-transitional</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </disk>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <graphics supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='type'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vnc</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>egl-headless</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>dbus</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </graphics>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <video supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='modelType'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vga</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>cirrus</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>none</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>bochs</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>ramfb</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </video>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <hostdev supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='mode'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>subsystem</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='startupPolicy'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>default</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>mandatory</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>requisite</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>optional</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='subsysType'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>usb</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>pci</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>scsi</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='capsType'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='pciBackend'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </hostdev>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <rng supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='model'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio-transitional</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio-non-transitional</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>random</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>egd</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>builtin</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </rng>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <filesystem supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='driverType'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>path</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>handle</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtiofs</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </filesystem>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <tpm supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='model'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>tpm-tis</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>tpm-crb</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>emulator</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>external</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='backendVersion'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>2.0</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </tpm>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <redirdev supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='bus'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>usb</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </redirdev>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <channel supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='type'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>pty</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>unix</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </channel>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <crypto supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='model'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='type'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>qemu</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>builtin</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </crypto>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <interface supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='backendType'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>default</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>passt</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </interface>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <panic supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='model'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>isa</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>hyperv</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </panic>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <console supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='type'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>null</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vc</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>pty</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>dev</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>file</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>pipe</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>stdio</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>udp</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>tcp</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>unix</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>qemu-vdagent</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>dbus</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </console>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </devices>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <features>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <gic supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <vmcoreinfo supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <genid supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <backingStoreInput supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <backup supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <async-teardown supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <s390-pv supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <ps2 supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <tdx supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <sev supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <sgx supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <hyperv supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='features'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>relaxed</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vapic</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>spinlocks</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vpindex</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>runtime</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>synic</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>stimer</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>reset</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vendor_id</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>frequencies</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>reenlightenment</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>tlbflush</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>ipi</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>avic</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>emsr_bitmap</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>xmm_input</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <defaults>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <spinlocks>4095</spinlocks>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <stimer_direct>on</stimer_direct>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </defaults>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </hyperv>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <launchSecurity supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </features>
Jan 23 05:10:57 np0005593295 nova_compute[225701]: </domainCapabilities>
Jan 23 05:10:57 np0005593295 nova_compute[225701]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.664 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.669 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 23 05:10:57 np0005593295 nova_compute[225701]: <domainCapabilities>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <domain>kvm</domain>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <arch>x86_64</arch>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <vcpu max='4096'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <iothreads supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <os supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <enum name='firmware'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <value>efi</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <loader supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='type'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>rom</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>pflash</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='readonly'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>yes</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>no</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='secure'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>yes</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>no</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </loader>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </os>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <cpu>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <mode name='host-passthrough' supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='hostPassthroughMigratable'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>on</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>off</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </mode>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <mode name='maximum' supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='maximumMigratable'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>on</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>off</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </mode>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <mode name='host-model' supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <vendor>AMD</vendor>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='x2apic'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='hypervisor'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='stibp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='ssbd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='overflow-recov'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='succor'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='lbrv'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='tsc-scale'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='flushbyasid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='pause-filter'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='pfthreshold'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='disable' name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </mode>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <mode name='custom' supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-noTSX'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='ClearwaterForest'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ddpd-u'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sha512'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sm3'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sm4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='ClearwaterForest-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ddpd-u'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sha512'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sm3'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sm4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cooperlake'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cooperlake-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cooperlake-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Denverton'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Denverton-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Denverton-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Denverton-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Dhyana-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Genoa'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Milan'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Milan-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Milan-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Milan-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Rome'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Rome-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Rome-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Rome-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Turin'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbpb'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Turin-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbpb'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-v5'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='GraniteRapids'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='GraniteRapids-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='GraniteRapids-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-128'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-256'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-512'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='GraniteRapids-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-128'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-256'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-512'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-noTSX'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v5'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v6'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v7'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='IvyBridge'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='IvyBridge-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='IvyBridge-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='IvyBridge-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='KnightsMill'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512er'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512pf'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='KnightsMill-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512er'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512pf'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Opteron_G4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Opteron_G4-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Opteron_G5'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tbm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Opteron_G5-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tbm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SapphireRapids'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SapphireRapids-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SapphireRapids-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SapphireRapids-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SapphireRapids-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SierraForest'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SierraForest-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SierraForest-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SierraForest-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-v5'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Snowridge'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Snowridge-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Snowridge-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Snowridge-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Snowridge-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='athlon'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='athlon-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='core2duo'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='core2duo-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='coreduo'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='coreduo-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='n270'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='n270-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='phenom'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='phenom-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </mode>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </cpu>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <memoryBacking supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <enum name='sourceType'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <value>file</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <value>anonymous</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <value>memfd</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </memoryBacking>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <devices>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <disk supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='diskDevice'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>disk</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>cdrom</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>floppy</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>lun</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='bus'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>fdc</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>scsi</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>usb</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>sata</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='model'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio-transitional</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio-non-transitional</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </disk>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <graphics supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='type'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vnc</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>egl-headless</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>dbus</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </graphics>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <video supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='modelType'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vga</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>cirrus</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>none</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>bochs</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>ramfb</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </video>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <hostdev supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='mode'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>subsystem</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='startupPolicy'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>default</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>mandatory</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>requisite</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>optional</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='subsysType'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>usb</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>pci</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>scsi</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='capsType'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='pciBackend'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </hostdev>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <rng supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='model'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio-transitional</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio-non-transitional</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>random</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>egd</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>builtin</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </rng>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <filesystem supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='driverType'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>path</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>handle</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtiofs</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </filesystem>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <tpm supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='model'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>tpm-tis</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>tpm-crb</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>emulator</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>external</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='backendVersion'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>2.0</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </tpm>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <redirdev supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='bus'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>usb</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </redirdev>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <channel supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='type'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>pty</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>unix</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </channel>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <crypto supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='model'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='type'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>qemu</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>builtin</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </crypto>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <interface supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='backendType'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>default</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>passt</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </interface>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <panic supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='model'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>isa</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>hyperv</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </panic>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <console supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='type'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>null</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vc</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>pty</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>dev</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>file</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>pipe</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>stdio</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>udp</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>tcp</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>unix</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>qemu-vdagent</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>dbus</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </console>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </devices>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <features>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <gic supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <vmcoreinfo supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <genid supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <backingStoreInput supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <backup supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <async-teardown supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <s390-pv supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <ps2 supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <tdx supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <sev supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <sgx supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <hyperv supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='features'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>relaxed</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vapic</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>spinlocks</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vpindex</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>runtime</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>synic</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>stimer</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>reset</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vendor_id</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>frequencies</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>reenlightenment</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>tlbflush</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>ipi</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>avic</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>emsr_bitmap</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>xmm_input</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <defaults>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <spinlocks>4095</spinlocks>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <stimer_direct>on</stimer_direct>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </defaults>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </hyperv>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <launchSecurity supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </features>
Jan 23 05:10:57 np0005593295 nova_compute[225701]: </domainCapabilities>
Jan 23 05:10:57 np0005593295 nova_compute[225701]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.752 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 23 05:10:57 np0005593295 nova_compute[225701]: <domainCapabilities>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <domain>kvm</domain>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <arch>x86_64</arch>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <vcpu max='240'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <iothreads supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <os supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <enum name='firmware'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <loader supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='type'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>rom</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>pflash</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='readonly'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>yes</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>no</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='secure'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>no</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </loader>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </os>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <cpu>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <mode name='host-passthrough' supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='hostPassthroughMigratable'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>on</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>off</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </mode>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <mode name='maximum' supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='maximumMigratable'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>on</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>off</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </mode>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <mode name='host-model' supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <vendor>AMD</vendor>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='x2apic'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='hypervisor'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='stibp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='ssbd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='overflow-recov'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='succor'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='lbrv'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='tsc-scale'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='flushbyasid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='pause-filter'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='pfthreshold'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <feature policy='disable' name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </mode>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <mode name='custom' supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-noTSX'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Broadwell-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='ClearwaterForest'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ddpd-u'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sha512'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sm3'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sm4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='ClearwaterForest-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ddpd-u'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sha512'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sm3'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sm4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cooperlake'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cooperlake-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Cooperlake-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Denverton'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Denverton-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Denverton-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Denverton-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Dhyana-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Genoa'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Milan'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Milan-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Milan-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Milan-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Rome'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Rome-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Rome-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Rome-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Turin'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbpb'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-Turin-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amd-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='auto-ibrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vp2intersect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fs-gs-base-ns'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibpb-brtype'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='no-nested-data-bp'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='null-sel-clr-base'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='perfmon-v2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbpb'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='srso-user-kernel-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='stibp-always-on'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='EPYC-v5'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='GraniteRapids'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='GraniteRapids-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='GraniteRapids-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-128'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-256'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-512'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='GraniteRapids-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-128'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-256'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx10-512'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='prefetchiti'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-noTSX'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Haswell-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v5'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v6'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Icelake-Server-v7'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='IvyBridge'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='IvyBridge-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='IvyBridge-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='IvyBridge-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='KnightsMill'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512er'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512pf'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='KnightsMill-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-4fmaps'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-4vnniw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512er'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512pf'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Opteron_G4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Opteron_G4-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Opteron_G5'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tbm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Opteron_G5-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fma4'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tbm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xop'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SapphireRapids'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SapphireRapids-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SapphireRapids-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SapphireRapids-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SapphireRapids-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='amx-tile'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-bf16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-fp16'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512-vpopcntdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bitalg'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vbmi2'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrc'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fzrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='la57'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='taa-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='tsx-ldtrk'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SierraForest'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SierraForest-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SierraForest-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='SierraForest-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ifma'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-ne-convert'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx-vnni-int8'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bhi-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='bus-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cmpccxadd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fbsdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='fsrs'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ibrs-all'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='intel-psfd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ipred-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='lam'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mcdt-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pbrsb-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='psdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rrsba-ctrl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='sbdr-ssdp-no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='serialize'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vaes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='vpclmulqdq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Client-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='hle'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='rtm'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Skylake-Server-v5'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512bw'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512cd'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512dq'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512f'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='avx512vl'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='invpcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pcid'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='pku'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Snowridge'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Snowridge-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='mpx'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Snowridge-v2'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Snowridge-v3'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='core-capability'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='split-lock-detect'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='Snowridge-v4'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='cldemote'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='erms'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='gfni'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdir64b'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='movdiri'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='xsaves'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='athlon'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='athlon-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='core2duo'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='core2duo-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='coreduo'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='coreduo-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='n270'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='n270-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='ss'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='phenom'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <blockers model='phenom-v1'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnow'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <feature name='3dnowext'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </blockers>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </mode>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </cpu>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <memoryBacking supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <enum name='sourceType'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <value>file</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <value>anonymous</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <value>memfd</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </memoryBacking>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <devices>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <disk supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='diskDevice'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>disk</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>cdrom</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>floppy</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>lun</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='bus'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>ide</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>fdc</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>scsi</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>usb</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>sata</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='model'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio-transitional</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio-non-transitional</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </disk>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <graphics supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='type'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vnc</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>egl-headless</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>dbus</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </graphics>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <video supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='modelType'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vga</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>cirrus</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>none</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>bochs</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>ramfb</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </video>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <hostdev supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='mode'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>subsystem</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='startupPolicy'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>default</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>mandatory</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>requisite</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>optional</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='subsysType'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>usb</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>pci</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>scsi</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='capsType'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='pciBackend'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </hostdev>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <rng supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='model'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio-transitional</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtio-non-transitional</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>random</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>egd</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>builtin</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </rng>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <filesystem supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='driverType'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>path</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>handle</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>virtiofs</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </filesystem>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <tpm supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='model'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>tpm-tis</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>tpm-crb</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>emulator</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>external</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='backendVersion'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>2.0</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </tpm>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <redirdev supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='bus'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>usb</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </redirdev>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <channel supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='type'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>pty</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>unix</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </channel>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <crypto supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='model'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='type'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>qemu</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='backendModel'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>builtin</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </crypto>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <interface supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='backendType'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>default</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>passt</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </interface>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <panic supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='model'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>isa</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>hyperv</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </panic>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <console supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='type'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>null</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vc</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>pty</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>dev</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>file</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>pipe</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>stdio</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>udp</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>tcp</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>unix</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>qemu-vdagent</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>dbus</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </console>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </devices>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  <features>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <gic supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <vmcoreinfo supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <genid supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <backingStoreInput supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <backup supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <async-teardown supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <s390-pv supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <ps2 supported='yes'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <tdx supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <sev supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <sgx supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <hyperv supported='yes'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <enum name='features'>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>relaxed</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vapic</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>spinlocks</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vpindex</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>runtime</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>synic</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>stimer</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>reset</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>vendor_id</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>frequencies</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>reenlightenment</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>tlbflush</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>ipi</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>avic</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>emsr_bitmap</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <value>xmm_input</value>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </enum>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      <defaults>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <spinlocks>4095</spinlocks>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <stimer_direct>on</stimer_direct>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:      </defaults>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    </hyperv>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:    <launchSecurity supported='no'/>
Jan 23 05:10:57 np0005593295 nova_compute[225701]:  </features>
Jan 23 05:10:57 np0005593295 nova_compute[225701]: </domainCapabilities>
Jan 23 05:10:57 np0005593295 nova_compute[225701]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.832 225706 DEBUG nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.832 225706 INFO nova.virt.libvirt.host [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Secure Boot support detected#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.834 225706 INFO nova.virt.libvirt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.835 225706 INFO nova.virt.libvirt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.844 225706 DEBUG nova.virt.libvirt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.860 225706 INFO nova.virt.node [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Determined node identity db762d15-510c-4120-bfc4-afe76b90b657 from /var/lib/nova/compute_id#033[00m
Jan 23 05:10:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:10:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.872 225706 WARNING nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Compute nodes ['db762d15-510c-4120-bfc4-afe76b90b657'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.893 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.912 225706 WARNING nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.912 225706 DEBUG oslo_concurrency.lockutils [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.912 225706 DEBUG oslo_concurrency.lockutils [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.913 225706 DEBUG oslo_concurrency.lockutils [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.913 225706 DEBUG nova.compute.resource_tracker [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:10:57 np0005593295 nova_compute[225701]: 2026-01-23 10:10:57.913 225706 DEBUG oslo_concurrency.processutils [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:10:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:10:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:58.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:10:58 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:10:58 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/758454433' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:10:58 np0005593295 nova_compute[225701]: 2026-01-23 10:10:58.381 225706 DEBUG oslo_concurrency.processutils [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:10:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:10:58 np0005593295 nova_compute[225701]: 2026-01-23 10:10:58.531 225706 WARNING nova.virt.libvirt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:10:58 np0005593295 nova_compute[225701]: 2026-01-23 10:10:58.532 225706 DEBUG nova.compute.resource_tracker [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5271MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:18:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:18:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:52 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:18:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:52.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:52 np0005593295 rsyslogd[1004]: imjournal: 5028 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 23 05:18:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:52 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:18:52 np0005593295 nova_compute[225701]: 2026-01-23 10:18:52.869 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:18:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:18:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:52.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:18:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:18:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:53 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8001290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:18:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:18:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:54 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004130 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:18:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:54.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:54 np0005593295 nova_compute[225701]: 2026-01-23 10:18:54.185 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:54 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:18:54 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:18:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:54.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:18:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:55 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:18:55.484 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:18:55.484 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:18:55.484 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:18:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:18:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:56 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:18:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:56.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:56 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004150 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:18:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:56 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:18:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:18:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:18:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:56.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:18:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:18:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:57 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:18:57 np0005593295 nova_compute[225701]: 2026-01-23 10:18:57.869 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:18:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:58 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:18:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:18:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:58.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:18:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:58 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:18:58 np0005593295 nova_compute[225701]: 2026-01-23 10:18:58.779 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:18:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:18:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:58.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:18:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:18:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:18:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:59 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004170 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:18:59 np0005593295 nova_compute[225701]: 2026-01-23 10:18:59.189 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:18:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:18:59 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:59 np0005593295 nova_compute[225701]: 2026-01-23 10:18:59.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:59 np0005593295 nova_compute[225701]: 2026-01-23 10:18:59.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:18:59 np0005593295 nova_compute[225701]: 2026-01-23 10:18:59.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:18:59 np0005593295 nova_compute[225701]: 2026-01-23 10:18:59.813 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:18:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:59 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:18:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:18:59 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:19:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:00 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:00.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:00 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:00.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:01 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:01 np0005593295 nova_compute[225701]: 2026-01-23 10:19:01.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:02 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:02.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:02 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:02 np0005593295 nova_compute[225701]: 2026-01-23 10:19:02.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:02 np0005593295 nova_compute[225701]: 2026-01-23 10:19:02.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:02 np0005593295 nova_compute[225701]: 2026-01-23 10:19:02.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:19:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:02.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:02 np0005593295 nova_compute[225701]: 2026-01-23 10:19:02.918 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:02 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:19:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:03 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:03 np0005593295 nova_compute[225701]: 2026-01-23 10:19:03.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:03 np0005593295 nova_compute[225701]: 2026-01-23 10:19:03.827 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:03 np0005593295 nova_compute[225701]: 2026-01-23 10:19:03.827 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:03 np0005593295 nova_compute[225701]: 2026-01-23 10:19:03.827 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:03 np0005593295 nova_compute[225701]: 2026-01-23 10:19:03.828 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:19:03 np0005593295 nova_compute[225701]: 2026-01-23 10:19:03.828 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:04 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:04.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:04 np0005593295 nova_compute[225701]: 2026-01-23 10:19:04.234 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:04 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:19:04 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2031067823' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:19:04 np0005593295 nova_compute[225701]: 2026-01-23 10:19:04.336 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:04 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0041b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:04 np0005593295 nova_compute[225701]: 2026-01-23 10:19:04.547 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:19:04 np0005593295 nova_compute[225701]: 2026-01-23 10:19:04.549 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4919MB free_disk=59.942726135253906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:19:04 np0005593295 nova_compute[225701]: 2026-01-23 10:19:04.549 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:04 np0005593295 nova_compute[225701]: 2026-01-23 10:19:04.550 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:04 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:04 np0005593295 nova_compute[225701]: 2026-01-23 10:19:04.620 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:19:04 np0005593295 nova_compute[225701]: 2026-01-23 10:19:04.621 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:19:04 np0005593295 nova_compute[225701]: 2026-01-23 10:19:04.646 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:04.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:05 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:19:05 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4045296728' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:19:05 np0005593295 nova_compute[225701]: 2026-01-23 10:19:05.082 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:05 np0005593295 nova_compute[225701]: 2026-01-23 10:19:05.088 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:19:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:05 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:05 np0005593295 nova_compute[225701]: 2026-01-23 10:19:05.122 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:19:05 np0005593295 nova_compute[225701]: 2026-01-23 10:19:05.123 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:19:05 np0005593295 nova_compute[225701]: 2026-01-23 10:19:05.124 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:06 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:06.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:06 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:06.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:07 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0041d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:07 np0005593295 nova_compute[225701]: 2026-01-23 10:19:07.119 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:07 np0005593295 nova_compute[225701]: 2026-01-23 10:19:07.151 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:07 np0005593295 nova_compute[225701]: 2026-01-23 10:19:07.152 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:07 np0005593295 nova_compute[225701]: 2026-01-23 10:19:07.153 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:07 np0005593295 nova_compute[225701]: 2026-01-23 10:19:07.957 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:08 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:08.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:08 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:08.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:09 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:09 np0005593295 nova_compute[225701]: 2026-01-23 10:19:09.301 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/101909 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:19:09 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:09 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:19:09.846 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:19:09 np0005593295 nova_compute[225701]: 2026-01-23 10:19:09.846 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:09 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:19:09.847 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:19:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:10 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:10.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:10 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:10.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:11 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:12 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:12.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:12 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004210 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:12.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:12 np0005593295 nova_compute[225701]: 2026-01-23 10:19:12.959 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:13 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:13 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:19:13.850 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:14 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:14.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:14 np0005593295 nova_compute[225701]: 2026-01-23 10:19:14.304 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:14 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:14 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:14 np0005593295 podman[230656]: 2026-01-23 10:19:14.712874745 +0000 UTC m=+0.079217487 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:19:14 np0005593295 podman[230655]: 2026-01-23 10:19:14.752222362 +0000 UTC m=+0.124137248 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 23 05:19:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:14.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:15 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:16 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:16.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:16 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:16.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:17 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:19:17 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/613545648' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:19:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:17 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:17 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:19:17 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:19:17 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:19:17 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:19:17 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:19:17 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:19:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:18 np0005593295 nova_compute[225701]: 2026-01-23 10:19:18.014 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:18 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:18.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:18 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003e90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:18.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:19 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ecc004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:19 np0005593295 nova_compute[225701]: 2026-01-23 10:19:19.307 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:19 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:20 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:20.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:20 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:20.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:21 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:21 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:19:21 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:19:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:22 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:22.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:22 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:22.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:23 np0005593295 nova_compute[225701]: 2026-01-23 10:19:23.016 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:23 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:24 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:24.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:24 np0005593295 nova_compute[225701]: 2026-01-23 10:19:24.311 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:24 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:24 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:24.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:25 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:26.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:26 np0005593295 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 23 05:19:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:26.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:27 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:28 np0005593295 nova_compute[225701]: 2026-01-23 10:19:28.018 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:28 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:28.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:28 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0042b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:28.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:29 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:29 np0005593295 nova_compute[225701]: 2026-01-23 10:19:29.314 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:29 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:30 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:30 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:30.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:30.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:31 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0042d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:32 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:32 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:32.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:32.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:33 np0005593295 nova_compute[225701]: 2026-01-23 10:19:33.019 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:33 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:34 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0042f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:34 np0005593295 nova_compute[225701]: 2026-01-23 10:19:34.317 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:34 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:34.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:34 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:34.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:35 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0003a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:36 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0003a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:36 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004310 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:36.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:36.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:37 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:38 np0005593295 nova_compute[225701]: 2026-01-23 10:19:38.021 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:38 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0003a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:38 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004310 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:38.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:38.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:39 np0005593295 nova_compute[225701]: 2026-01-23 10:19:39.319 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:39 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:40 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:40 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0003a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:40.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:40.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:41 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004330 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:42 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:42 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:42.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:42.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:43 np0005593295 nova_compute[225701]: 2026-01-23 10:19:43.059 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:43 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0003a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:44 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0003a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:44 np0005593295 nova_compute[225701]: 2026-01-23 10:19:44.330 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:44 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:44.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:44 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:44.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:45 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:45 np0005593295 podman[230892]: 2026-01-23 10:19:45.633438679 +0000 UTC m=+0.059250792 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:19:45 np0005593295 podman[230891]: 2026-01-23 10:19:45.65941245 +0000 UTC m=+0.084003241 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 23 05:19:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:46 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:46 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004390 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:46.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:46.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:47 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:48 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:48 np0005593295 nova_compute[225701]: 2026-01-23 10:19:48.111 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:48 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0003a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:48.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:48.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:49 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:49 np0005593295 nova_compute[225701]: 2026-01-23 10:19:49.333 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:49 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:50 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:50 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:50.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:50.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:51 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0003a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:52 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0003a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:52 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:52.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:52.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:53 np0005593295 nova_compute[225701]: 2026-01-23 10:19:53.115 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:53 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:54 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:54 np0005593295 nova_compute[225701]: 2026-01-23 10:19:54.335 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:54 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004550 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:54.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:54 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:54.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:55 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:19:55.485 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:19:55.486 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:19:55.486 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:56 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:56 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:56.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:56.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:57 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004570 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:58 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004570 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:58 np0005593295 nova_compute[225701]: 2026-01-23 10:19:58.118 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:58 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:58.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:58 np0005593295 nova_compute[225701]: 2026-01-23 10:19:58.812 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:19:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:19:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:58.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:19:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:19:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:19:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:19:59 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:19:59 np0005593295 nova_compute[225701]: 2026-01-23 10:19:59.368 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:19:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:00 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:00 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:00.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:00 np0005593295 nova_compute[225701]: 2026-01-23 10:20:00.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:00 np0005593295 nova_compute[225701]: 2026-01-23 10:20:00.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:20:00 np0005593295 nova_compute[225701]: 2026-01-23 10:20:00.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:20:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:00 np0005593295 nova_compute[225701]: 2026-01-23 10:20:00.799 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:20:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:00.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:01 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:02 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:02 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:02.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:02 np0005593295 nova_compute[225701]: 2026-01-23 10:20:02.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:02.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:03 np0005593295 nova_compute[225701]: 2026-01-23 10:20:03.121 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:03 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0045b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:03 np0005593295 ceph-mds[83039]: mds.beacon.cephfs.compute-2.prgzmm missed beacon ack from the monitors
Jan 23 05:20:03 np0005593295 nova_compute[225701]: 2026-01-23 10:20:03.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:03 np0005593295 nova_compute[225701]: 2026-01-23 10:20:03.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:03 np0005593295 nova_compute[225701]: 2026-01-23 10:20:03.816 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:20:03 np0005593295 nova_compute[225701]: 2026-01-23 10:20:03.816 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:20:03 np0005593295 nova_compute[225701]: 2026-01-23 10:20:03.816 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:20:03 np0005593295 nova_compute[225701]: 2026-01-23 10:20:03.817 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:20:03 np0005593295 nova_compute[225701]: 2026-01-23 10:20:03.817 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:20:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:04 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:04 np0005593295 nova_compute[225701]: 2026-01-23 10:20:04.371 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:04 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8003500 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:04.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:04.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:05 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:06 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:06 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency slow operation observed for kv_commit, latency = 5.380156040s
Jan 23 05:20:06 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency slow operation observed for kv_sync, latency = 5.380156517s
Jan 23 05:20:06 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency_fn slow operation observed for _txc_committed_kv, latency = 5.380500793s, txc = 0x559226356c00
Jan 23 05:20:06 np0005593295 ceph-mon[75771]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 05:20:06 np0005593295 ceph-mon[75771]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 23 05:20:06 np0005593295 ceph-mon[75771]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 23 05:20:06 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 05:20:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:06 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:06.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:06 np0005593295 nova_compute[225701]: 2026-01-23 10:20:06.617 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.799s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:20:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:06 np0005593295 nova_compute[225701]: 2026-01-23 10:20:06.793 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:20:06 np0005593295 nova_compute[225701]: 2026-01-23 10:20:06.794 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4912MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:20:06 np0005593295 nova_compute[225701]: 2026-01-23 10:20:06.795 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:20:06 np0005593295 nova_compute[225701]: 2026-01-23 10:20:06.795 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:20:06 np0005593295 nova_compute[225701]: 2026-01-23 10:20:06.874 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:20:06 np0005593295 nova_compute[225701]: 2026-01-23 10:20:06.874 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:20:06 np0005593295 nova_compute[225701]: 2026-01-23 10:20:06.892 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:20:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:06.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:07 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:07 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:20:07 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1702015841' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:20:07 np0005593295 nova_compute[225701]: 2026-01-23 10:20:07.339 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:20:07 np0005593295 nova_compute[225701]: 2026-01-23 10:20:07.344 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:20:07 np0005593295 nova_compute[225701]: 2026-01-23 10:20:07.361 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:20:07 np0005593295 nova_compute[225701]: 2026-01-23 10:20:07.362 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:20:07 np0005593295 nova_compute[225701]: 2026-01-23 10:20:07.362 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.568s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:20:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:08 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:08 np0005593295 nova_compute[225701]: 2026-01-23 10:20:08.123 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:08 np0005593295 nova_compute[225701]: 2026-01-23 10:20:08.363 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:08 np0005593295 nova_compute[225701]: 2026-01-23 10:20:08.364 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:08 np0005593295 nova_compute[225701]: 2026-01-23 10:20:08.364 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:08 np0005593295 nova_compute[225701]: 2026-01-23 10:20:08.364 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:20:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:08 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc0045f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:08.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:08 np0005593295 nova_compute[225701]: 2026-01-23 10:20:08.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:08.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:08 np0005593295 ceph-mon[75771]: overall HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 05:20:08 np0005593295 ceph-mon[75771]: mon.compute-1 calling monitor election
Jan 23 05:20:08 np0005593295 ceph-mon[75771]: mon.compute-0 calling monitor election
Jan 23 05:20:08 np0005593295 ceph-mon[75771]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 23 05:20:08 np0005593295 ceph-mon[75771]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 05:20:08 np0005593295 ceph-mon[75771]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 05:20:08 np0005593295 ceph-mon[75771]:     osd.1 observed slow operation indications in BlueStore
Jan 23 05:20:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:09 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:09 np0005593295 nova_compute[225701]: 2026-01-23 10:20:09.374 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:10 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8003500 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:10 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:10.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:10.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:11 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004610 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:11 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:12 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:12 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8003500 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:12.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:12.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:13 np0005593295 ceph-mon[75771]: Health check update: 2 OSD(s) experiencing slow operations in BlueStore (BLUESTORE_SLOW_OP_ALERT)
Jan 23 05:20:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:13 np0005593295 nova_compute[225701]: 2026-01-23 10:20:13.125 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:13 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:14 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8003500 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:14 np0005593295 nova_compute[225701]: 2026-01-23 10:20:14.377 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:14 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:14.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:14.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:15 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:16 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:16 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:16 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8003500 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:16.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:16 np0005593295 podman[231043]: 2026-01-23 10:20:16.665790278 +0000 UTC m=+0.074812035 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 23 05:20:16 np0005593295 podman[231042]: 2026-01-23 10:20:16.687763665 +0000 UTC m=+0.111199594 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:20:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:16.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:17 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:18 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:18 np0005593295 nova_compute[225701]: 2026-01-23 10:20:18.126 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:18 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:18.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:18 np0005593295 nova_compute[225701]: 2026-01-23 10:20:18.564 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:18 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:20:18.564 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:20:18 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:20:18.566 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:20:18 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:20:18.568 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:20:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:18.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:19 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8003500 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:19 np0005593295 nova_compute[225701]: 2026-01-23 10:20:19.571 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:20 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:20 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:20.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:20.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:21 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:21 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:22 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8003500 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:22 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 05:20:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:22.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 05:20:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:22.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:23 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:23 np0005593295 nova_compute[225701]: 2026-01-23 10:20:23.189 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:24 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:20:24 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:20:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:24 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:24 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:24.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:24 np0005593295 nova_compute[225701]: 2026-01-23 10:20:24.575 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:24.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:25 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:26 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:20:26 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:20:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:26.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:26 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:26.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:27 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:28 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:28 np0005593295 nova_compute[225701]: 2026-01-23 10:20:28.195 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:28 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:28.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:28.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:29 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:29 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:20:29 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:20:29 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:20:29 np0005593295 nova_compute[225701]: 2026-01-23 10:20:29.577 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:30 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:30 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:20:30 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:20:30 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:20:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:30 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:30.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:30.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:31 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:31 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:32 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:32 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:32.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:32.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:33 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:33 np0005593295 nova_compute[225701]: 2026-01-23 10:20:33.234 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:34 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:34 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 05:20:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:34.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 05:20:34 np0005593295 nova_compute[225701]: 2026-01-23 10:20:34.580 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:34.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:35 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:20:35 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:20:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:35 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:36 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:36.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:36 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:36 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:36.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:37 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:38 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:38 np0005593295 nova_compute[225701]: 2026-01-23 10:20:38.237 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:38 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:38.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:38.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:39 np0005593295 nova_compute[225701]: 2026-01-23 10:20:39.582 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:40 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:20:40 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1800.0 total, 600.0 interval
Cumulative writes: 5073 writes, 27K keys, 5073 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.04 MB/s
Cumulative WAL: 5073 writes, 5073 syncs, 1.00 writes per sync, written: 0.07 GB, 0.04 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1503 writes, 7262 keys, 1503 commit groups, 1.0 writes per commit group, ingest: 17.13 MB, 0.03 MB/s
Interval WAL: 1504 writes, 1504 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     88.0      0.43              0.17        14    0.031       0      0       0.0       0.0
  L6      1/0   12.23 MB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   4.3     86.9     75.1      2.14              0.71        13    0.165     68K   6779       0.0       0.0
 Sum      1/0   12.23 MB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   5.3     72.5     77.2      2.57              0.88        27    0.095     68K   6779       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.3    102.5    101.7      0.71              0.24        10    0.071     29K   2602       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Low      0/0    0.00 KB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   0.0     86.9     75.1      2.14              0.71        13    0.165     68K   6779       0.0       0.0
High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     88.5      0.43              0.17        13    0.033       0      0       0.0       0.0
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1800.0 total, 600.0 interval
Flush(GB): cumulative 0.037, interval 0.010
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.19 GB write, 0.11 MB/s write, 0.18 GB read, 0.10 MB/s read, 2.6 seconds
Interval compaction: 0.07 GB write, 0.12 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.7 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55c6513709b0#2 capacity: 304.00 MB usage: 13.48 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000126 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(716,12.93 MB,4.25455%) FilterBlock(27,201.17 KB,0.064624%) IndexBlock(27,355.48 KB,0.114195%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **
Jan 23 05:20:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:40 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:40 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:40.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:40.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:41 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:41 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:42 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:42 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:42.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:42.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:43 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:43 np0005593295 nova_compute[225701]: 2026-01-23 10:20:43.239 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:44 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:44 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:44.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:44 np0005593295 nova_compute[225701]: 2026-01-23 10:20:44.593 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:44.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:45 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:46 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:46 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:46.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:46 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:46.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:47 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:47 np0005593295 podman[231280]: 2026-01-23 10:20:47.650935376 +0000 UTC m=+0.062769223 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:20:47 np0005593295 podman[231279]: 2026-01-23 10:20:47.679808852 +0000 UTC m=+0.104317246 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 23 05:20:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:48 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:48 np0005593295 nova_compute[225701]: 2026-01-23 10:20:48.240 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:48 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:48.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:48.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:49 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:49 np0005593295 nova_compute[225701]: 2026-01-23 10:20:49.595 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:50 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:50 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:50.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:51.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:51 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:51 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:52 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:52 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:52.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:53.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:53 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec004bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:53 np0005593295 nova_compute[225701]: 2026-01-23 10:20:53.282 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:54 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:54 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 05:20:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:54.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 05:20:54 np0005593295 nova_compute[225701]: 2026-01-23 10:20:54.629 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:55.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:55 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:20:55.487 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:20:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:20:55.488 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:20:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:20:55.488 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:20:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:56 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:56 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:56.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:56 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:57.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:57 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:58 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:58 np0005593295 nova_compute[225701]: 2026-01-23 10:20:58.285 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:58 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:20:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:58.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:20:58 np0005593295 nova_compute[225701]: 2026-01-23 10:20:58.779 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:20:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:59.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:20:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:20:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:20:59 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:20:59 np0005593295 nova_compute[225701]: 2026-01-23 10:20:59.632 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:20:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:00 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:00 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:00.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:00 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:21:00.662 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:21:00 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:21:00.663 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:21:00 np0005593295 nova_compute[225701]: 2026-01-23 10:21:00.665 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:01.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:01 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/102101 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:21:01 np0005593295 nova_compute[225701]: 2026-01-23 10:21:01.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:01 np0005593295 nova_compute[225701]: 2026-01-23 10:21:01.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:21:01 np0005593295 nova_compute[225701]: 2026-01-23 10:21:01.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:21:01 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:01 np0005593295 nova_compute[225701]: 2026-01-23 10:21:01.801 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:21:01 np0005593295 nova_compute[225701]: 2026-01-23 10:21:01.801 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:01 np0005593295 nova_compute[225701]: 2026-01-23 10:21:01.801 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:21:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:02 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:02 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:02.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:02 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:21:02.665 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:21:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:21:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:03.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:21:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:03 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:03 np0005593295 nova_compute[225701]: 2026-01-23 10:21:03.288 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:03 np0005593295 nova_compute[225701]: 2026-01-23 10:21:03.797 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:04 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:04 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 05:21:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:04.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 05:21:04 np0005593295 nova_compute[225701]: 2026-01-23 10:21:04.660 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 05:21:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:05.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 05:21:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:05 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec0003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:05 np0005593295 nova_compute[225701]: 2026-01-23 10:21:05.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:05 np0005593295 nova_compute[225701]: 2026-01-23 10:21:05.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:05 np0005593295 nova_compute[225701]: 2026-01-23 10:21:05.813 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:05 np0005593295 nova_compute[225701]: 2026-01-23 10:21:05.813 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:05 np0005593295 nova_compute[225701]: 2026-01-23 10:21:05.814 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:05 np0005593295 nova_compute[225701]: 2026-01-23 10:21:05.814 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:21:05 np0005593295 nova_compute[225701]: 2026-01-23 10:21:05.814 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:06 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:06 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:21:06 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1928470914' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:21:06 np0005593295 nova_compute[225701]: 2026-01-23 10:21:06.290 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:06 np0005593295 nova_compute[225701]: 2026-01-23 10:21:06.455 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:21:06 np0005593295 nova_compute[225701]: 2026-01-23 10:21:06.456 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4905MB free_disk=59.94269943237305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:21:06 np0005593295 nova_compute[225701]: 2026-01-23 10:21:06.456 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:06 np0005593295 nova_compute[225701]: 2026-01-23 10:21:06.456 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:06 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:06.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:06 np0005593295 nova_compute[225701]: 2026-01-23 10:21:06.603 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:21:06 np0005593295 nova_compute[225701]: 2026-01-23 10:21:06.604 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:21:06 np0005593295 nova_compute[225701]: 2026-01-23 10:21:06.675 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:06 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:21:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:07.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:21:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:07 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:21:07 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/731693582' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:21:07 np0005593295 nova_compute[225701]: 2026-01-23 10:21:07.084 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.409s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:07 np0005593295 nova_compute[225701]: 2026-01-23 10:21:07.090 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:21:07 np0005593295 nova_compute[225701]: 2026-01-23 10:21:07.103 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:21:07 np0005593295 nova_compute[225701]: 2026-01-23 10:21:07.105 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:21:07 np0005593295 nova_compute[225701]: 2026-01-23 10:21:07.105 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:07 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:08 np0005593295 nova_compute[225701]: 2026-01-23 10:21:08.101 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:08 np0005593295 nova_compute[225701]: 2026-01-23 10:21:08.119 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:08 np0005593295 nova_compute[225701]: 2026-01-23 10:21:08.119 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:08 np0005593295 nova_compute[225701]: 2026-01-23 10:21:08.119 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:21:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:08 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec00047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:08 np0005593295 nova_compute[225701]: 2026-01-23 10:21:08.290 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:08 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:21:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:08.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:21:08 np0005593295 nova_compute[225701]: 2026-01-23 10:21:08.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:08 np0005593295 nova_compute[225701]: 2026-01-23 10:21:08.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:09.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:09 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:09 np0005593295 nova_compute[225701]: 2026-01-23 10:21:09.675 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:09 np0005593295 nova_compute[225701]: 2026-01-23 10:21:09.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:10 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:21:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:10 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:10 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec00047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 05:21:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:10.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 05:21:10 np0005593295 nova_compute[225701]: 2026-01-23 10:21:10.806 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:10 np0005593295 nova_compute[225701]: 2026-01-23 10:21:10.806 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:21:10 np0005593295 nova_compute[225701]: 2026-01-23 10:21:10.923 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:21:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:11.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:11 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:12 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:12 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:12 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:21:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:12.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:21:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:13.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:13 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:21:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:13 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:21:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:13 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec00047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:13 np0005593295 nova_compute[225701]: 2026-01-23 10:21:13.348 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:14 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:14 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:21:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:14.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:21:14 np0005593295 nova_compute[225701]: 2026-01-23 10:21:14.729 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:15.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:15 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:16 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec00047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:16 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:21:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:16 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:16.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:17 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:21:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:17.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:21:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:17 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:18 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:18 np0005593295 nova_compute[225701]: 2026-01-23 10:21:18.350 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:18 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ed0002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:18.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:18 np0005593295 podman[231427]: 2026-01-23 10:21:18.643433905 +0000 UTC m=+0.072705110 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 05:21:18 np0005593295 podman[231426]: 2026-01-23 10:21:18.644225966 +0000 UTC m=+0.075841572 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 05:21:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:21:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:19.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:21:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:19 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:19 np0005593295 nova_compute[225701]: 2026-01-23 10:21:19.732 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:20 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec00047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:20 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:20.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:21.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:21 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/102121 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:21:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:22 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:22 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:22 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec00047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 05:21:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:22.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 05:21:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:21:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:23.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:21:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:23 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec00047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:23 np0005593295 nova_compute[225701]: 2026-01-23 10:21:23.351 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:24 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec00047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:24 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 05:21:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:24.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 05:21:24 np0005593295 nova_compute[225701]: 2026-01-23 10:21:24.734 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:25.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:25 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec00047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:26 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:21:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:26.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:21:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:21:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:27.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:21:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:27 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:27 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:28 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:28 np0005593295 nova_compute[225701]: 2026-01-23 10:21:28.375 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:28 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee80028a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:28.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:29.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:29 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80017c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:29 np0005593295 nova_compute[225701]: 2026-01-23 10:21:29.783 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:30 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec002d40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:30 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec002d40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:30.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:31.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:31 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8004b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:32 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8001940 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:32 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:32 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:21:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:32.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:21:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:33.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:33 np0005593295 nova_compute[225701]: 2026-01-23 10:21:33.377 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:33 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec002d40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:34 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8004b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:34 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8001940 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 05:21:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:34.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 05:21:34 np0005593295 nova_compute[225701]: 2026-01-23 10:21:34.785 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:21:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:35.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:21:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:35 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:36 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec0047c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:36 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8004b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:36.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:37.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:37 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8001ae0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:37 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:38 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:38 np0005593295 nova_compute[225701]: 2026-01-23 10:21:38.382 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:38 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec0047c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:38.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:39.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:39 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:21:39 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:21:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:39 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8004b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:39 np0005593295 nova_compute[225701]: 2026-01-23 10:21:39.836 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:40 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80035d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:40 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:21:40 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:21:40 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:21:40 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:21:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:40 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c270 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:40.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:41.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:41 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec0047c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:42 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8004b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:42 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:42 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec80035d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:21:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:42.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:21:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:43.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:43 np0005593295 nova_compute[225701]: 2026-01-23 10:21:43.419 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:43 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:44 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec0047c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:44 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8004b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:21:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:44.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:21:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:44 np0005593295 nova_compute[225701]: 2026-01-23 10:21:44.839 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:21:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:45.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:21:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:45 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:21:45 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:21:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:45 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8003ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:46 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:46 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec0047c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:46.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:46 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Jan 23 05:21:46 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:46.869627) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:21:46 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Jan 23 05:21:46 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163706869866, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2404, "num_deletes": 251, "total_data_size": 6597834, "memory_usage": 6707424, "flush_reason": "Manual Compaction"}
Jan 23 05:21:46 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Jan 23 05:21:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:21:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:47.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:21:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:47 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8004b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:47 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163708075384, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 4220070, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25949, "largest_seqno": 28348, "table_properties": {"data_size": 4210435, "index_size": 6065, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20450, "raw_average_key_size": 20, "raw_value_size": 4190923, "raw_average_value_size": 4207, "num_data_blocks": 262, "num_entries": 996, "num_filter_entries": 996, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163491, "oldest_key_time": 1769163491, "file_creation_time": 1769163706, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:21:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:48 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8003ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 1366208 microseconds, and 10178 cpu microseconds.
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.075559) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 4220070 bytes OK
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.235949) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.241027) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.241075) EVENT_LOG_v1 {"time_micros": 1769163708241064, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.241114) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 6587235, prev total WAL file size 6589174, number of live WAL files 2.
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.242960) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(4121KB)], [51(12MB)]
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163708243148, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 17044356, "oldest_snapshot_seqno": -1}
Jan 23 05:21:48 np0005593295 nova_compute[225701]: 2026-01-23 10:21:48.475 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5924 keys, 14851976 bytes, temperature: kUnknown
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163708537791, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 14851976, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14811093, "index_size": 24965, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14853, "raw_key_size": 150592, "raw_average_key_size": 25, "raw_value_size": 14702654, "raw_average_value_size": 2481, "num_data_blocks": 1017, "num_entries": 5924, "num_filter_entries": 5924, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769163708, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:21:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:48 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c2d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 05:21:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:48.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.538143) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 14851976 bytes
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.781419) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 57.8 rd, 50.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 12.2 +0.0 blob) out(14.2 +0.0 blob), read-write-amplify(7.6) write-amplify(3.5) OK, records in: 6447, records dropped: 523 output_compression: NoCompression
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.781460) EVENT_LOG_v1 {"time_micros": 1769163708781445, "job": 30, "event": "compaction_finished", "compaction_time_micros": 294774, "compaction_time_cpu_micros": 37573, "output_level": 6, "num_output_files": 1, "total_output_size": 14851976, "num_input_records": 6447, "num_output_records": 5924, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163708782406, "job": 30, "event": "table_file_deletion", "file_number": 53}
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163708784519, "job": 30, "event": "table_file_deletion", "file_number": 51}
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.242801) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.784573) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.784577) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.784579) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.784581) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:21:48 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:21:48.784583) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:21:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:49.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:49 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec0047c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:49 np0005593295 podman[231730]: 2026-01-23 10:21:49.641586862 +0000 UTC m=+0.056916051 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:21:49 np0005593295 podman[231729]: 2026-01-23 10:21:49.743501886 +0000 UTC m=+0.165190279 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:21:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:49 np0005593295 nova_compute[225701]: 2026-01-23 10:21:49.841 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:50 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ee8004b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:50 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2ec8003ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:50.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:50 np0005593295 nova_compute[225701]: 2026-01-23 10:21:50.780 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:51.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:51 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2edc00c2f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:21:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:52 np0005593295 kernel: ganesha.nfsd[231499]: segfault at 50 ip 00007f2f772a832e sp 00007f2efe7fb210 error 4 in libntirpc.so.5.8[7f2f7728d000+2c000] likely on CPU 3 (core 0, socket 3)
Jan 23 05:21:52 np0005593295 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 05:21:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[228963]: 23/01/2026 10:21:52 : epoch 69734a7a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2eec0047c0 fd 38 proxy ignored for local
Jan 23 05:21:52 np0005593295 systemd[1]: Started Process Core Dump (PID 231774/UID 0).
Jan 23 05:21:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:21:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:52.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:21:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:52 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:53.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:53 np0005593295 systemd-coredump[231775]: Process 228967 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 75:#012#0  0x00007f2f772a832e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 23 05:21:53 np0005593295 nova_compute[225701]: 2026-01-23 10:21:53.476 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:53 np0005593295 systemd[1]: systemd-coredump@11-231774-0.service: Deactivated successfully.
Jan 23 05:21:53 np0005593295 systemd[1]: systemd-coredump@11-231774-0.service: Consumed 1.174s CPU time.
Jan 23 05:21:53 np0005593295 podman[231782]: 2026-01-23 10:21:53.544260032 +0000 UTC m=+0.023946659 container died fa785a85e35a7804c787f20020accc24473951046161ad46c7682dbaa03899c9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 05:21:53 np0005593295 systemd[1]: var-lib-containers-storage-overlay-d753937e541bf38247f02c0eae4c66ab31e8c5f3996cd75a5da7d39e87934a76-merged.mount: Deactivated successfully.
Jan 23 05:21:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:54 np0005593295 podman[231782]: 2026-01-23 10:21:54.096959953 +0000 UTC m=+0.576646590 container remove fa785a85e35a7804c787f20020accc24473951046161ad46c7682dbaa03899c9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 05:21:54 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 05:21:54 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 05:21:54 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.950s CPU time.
Jan 23 05:21:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 05:21:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:54.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 05:21:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:54 np0005593295 nova_compute[225701]: 2026-01-23 10:21:54.842 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:55.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:21:55.488 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:21:55.488 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:21:55.489 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:56.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:57.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:57 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/102158 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:21:58 np0005593295 nova_compute[225701]: 2026-01-23 10:21:58.478 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:58.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:58 np0005593295 nova_compute[225701]: 2026-01-23 10:21:58.802 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:21:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:21:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:59.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:21:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:21:59 np0005593295 nova_compute[225701]: 2026-01-23 10:21:59.884 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:00.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:22:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:01.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:22:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:02 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:22:02 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 7795 writes, 32K keys, 7795 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s#012Cumulative WAL: 7795 writes, 1759 syncs, 4.43 writes per sync, written: 0.03 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1988 writes, 7632 keys, 1988 commit groups, 1.0 writes per commit group, ingest: 8.26 MB, 0.01 MB/s#012Interval WAL: 1988 writes, 772 syncs, 2.58 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:22:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:02.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:02 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:03.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:03 np0005593295 nova_compute[225701]: 2026-01-23 10:22:03.479 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:03 np0005593295 nova_compute[225701]: 2026-01-23 10:22:03.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:03 np0005593295 nova_compute[225701]: 2026-01-23 10:22:03.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:22:03 np0005593295 nova_compute[225701]: 2026-01-23 10:22:03.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:22:03 np0005593295 nova_compute[225701]: 2026-01-23 10:22:03.814 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:22:03 np0005593295 nova_compute[225701]: 2026-01-23 10:22:03.814 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:04 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 12.
Jan 23 05:22:04 np0005593295 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:22:04 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.950s CPU time.
Jan 23 05:22:04 np0005593295 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 05:22:04 np0005593295 podman[231885]: 2026-01-23 10:22:04.49453168 +0000 UTC m=+0.037889860 container create f37b9193829c226bd3c386457514070a84ef21e291b2b2ba15bdbe8360f58f39 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1)
Jan 23 05:22:04 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b999db203518d32001aadbecd54d65860121f1de2ce596582814e75cf37a5783/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 05:22:04 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b999db203518d32001aadbecd54d65860121f1de2ce596582814e75cf37a5783/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 05:22:04 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b999db203518d32001aadbecd54d65860121f1de2ce596582814e75cf37a5783/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:22:04 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b999db203518d32001aadbecd54d65860121f1de2ce596582814e75cf37a5783/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:22:04 np0005593295 podman[231885]: 2026-01-23 10:22:04.556786209 +0000 UTC m=+0.100144399 container init f37b9193829c226bd3c386457514070a84ef21e291b2b2ba15bdbe8360f58f39 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 05:22:04 np0005593295 podman[231885]: 2026-01-23 10:22:04.56263667 +0000 UTC m=+0.105994830 container start f37b9193829c226bd3c386457514070a84ef21e291b2b2ba15bdbe8360f58f39 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:22:04 np0005593295 bash[231885]: f37b9193829c226bd3c386457514070a84ef21e291b2b2ba15bdbe8360f58f39
Jan 23 05:22:04 np0005593295 podman[231885]: 2026-01-23 10:22:04.478030074 +0000 UTC m=+0.021388254 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 05:22:04 np0005593295 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:22:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:04 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 05:22:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:04 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 05:22:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:04 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 05:22:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:04 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 05:22:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:04 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 05:22:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:04 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 05:22:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:04 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 05:22:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:04.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:04 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:22:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:04 np0005593295 nova_compute[225701]: 2026-01-23 10:22:04.886 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:05.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:06.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:06 np0005593295 nova_compute[225701]: 2026-01-23 10:22:06.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:06 np0005593295 nova_compute[225701]: 2026-01-23 10:22:06.812 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:06 np0005593295 nova_compute[225701]: 2026-01-23 10:22:06.813 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:06 np0005593295 nova_compute[225701]: 2026-01-23 10:22:06.813 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:06 np0005593295 nova_compute[225701]: 2026-01-23 10:22:06.813 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:22:06 np0005593295 nova_compute[225701]: 2026-01-23 10:22:06.814 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:22:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:22:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:07.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:22:07 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:22:07 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2351850048' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:22:07 np0005593295 nova_compute[225701]: 2026-01-23 10:22:07.280 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:22:07 np0005593295 nova_compute[225701]: 2026-01-23 10:22:07.437 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:22:07 np0005593295 nova_compute[225701]: 2026-01-23 10:22:07.438 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4844MB free_disk=59.942752838134766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:22:07 np0005593295 nova_compute[225701]: 2026-01-23 10:22:07.438 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:07 np0005593295 nova_compute[225701]: 2026-01-23 10:22:07.438 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:07 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:07 np0005593295 nova_compute[225701]: 2026-01-23 10:22:07.994 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:22:07 np0005593295 nova_compute[225701]: 2026-01-23 10:22:07.994 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:22:08 np0005593295 nova_compute[225701]: 2026-01-23 10:22:08.021 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:22:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:08 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:22:08 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1160861138' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:22:08 np0005593295 nova_compute[225701]: 2026-01-23 10:22:08.454 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:22:08 np0005593295 nova_compute[225701]: 2026-01-23 10:22:08.462 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:22:08 np0005593295 nova_compute[225701]: 2026-01-23 10:22:08.481 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:08 np0005593295 nova_compute[225701]: 2026-01-23 10:22:08.515 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:22:08 np0005593295 nova_compute[225701]: 2026-01-23 10:22:08.518 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:22:08 np0005593295 nova_compute[225701]: 2026-01-23 10:22:08.518 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:22:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:08.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:22:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:22:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:09.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:22:09 np0005593295 nova_compute[225701]: 2026-01-23 10:22:09.519 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:09 np0005593295 nova_compute[225701]: 2026-01-23 10:22:09.520 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:09 np0005593295 nova_compute[225701]: 2026-01-23 10:22:09.520 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:09 np0005593295 nova_compute[225701]: 2026-01-23 10:22:09.520 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:09 np0005593295 nova_compute[225701]: 2026-01-23 10:22:09.520 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:22:09 np0005593295 nova_compute[225701]: 2026-01-23 10:22:09.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:09 np0005593295 nova_compute[225701]: 2026-01-23 10:22:09.933 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:10.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:10 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:22:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:10 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:22:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:22:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:11.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:22:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:12.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:12 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:13.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:13 np0005593295 nova_compute[225701]: 2026-01-23 10:22:13.515 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:14.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:14 np0005593295 nova_compute[225701]: 2026-01-23 10:22:14.936 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:22:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:15.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:22:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:16.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 05:22:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:16 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:22:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:17.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:17 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce00000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:17 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:18 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:18 np0005593295 nova_compute[225701]: 2026-01-23 10:22:18.517 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:18 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde4000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:18.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:19.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:19 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce00000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:19 np0005593295 nova_compute[225701]: 2026-01-23 10:22:19.982 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/102220 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:22:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:20 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde8000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:20 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:20 np0005593295 podman[232046]: 2026-01-23 10:22:20.621934199 +0000 UTC m=+0.043429663 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:22:20 np0005593295 podman[232045]: 2026-01-23 10:22:20.652125279 +0000 UTC m=+0.075435789 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:22:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:20.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:22:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:21.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:22:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:21 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:22 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:22 np0005593295 nova_compute[225701]: 2026-01-23 10:22:22.618 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:22 np0005593295 nova_compute[225701]: 2026-01-23 10:22:22.618 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:22 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcddc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:22 np0005593295 nova_compute[225701]: 2026-01-23 10:22:22.646 225706 DEBUG nova.compute.manager [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:22:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:22.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:22 np0005593295 nova_compute[225701]: 2026-01-23 10:22:22.737 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:22 np0005593295 nova_compute[225701]: 2026-01-23 10:22:22.737 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:22 np0005593295 nova_compute[225701]: 2026-01-23 10:22:22.743 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:22:22 np0005593295 nova_compute[225701]: 2026-01-23 10:22:22.743 225706 INFO nova.compute.claims [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:22:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:22 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:22 np0005593295 nova_compute[225701]: 2026-01-23 10:22:22.887 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:22:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:22:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:23.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:22:23 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:22:23 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1943420009' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.361 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.366 225706 DEBUG nova.compute.provider_tree [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.381 225706 DEBUG nova.scheduler.client.report [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.400 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.662s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.401 225706 DEBUG nova.compute.manager [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.454 225706 DEBUG nova.compute.manager [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.454 225706 DEBUG nova.network.neutron [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:22:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:23 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcddc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.475 225706 INFO nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.505 225706 DEBUG nova.compute.manager [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.520 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.624 225706 DEBUG nova.compute.manager [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.626 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.626 225706 INFO nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Creating image(s)#033[00m
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.657 225706 DEBUG nova.storage.rbd_utils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.686 225706 DEBUG nova.storage.rbd_utils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.709 225706 DEBUG nova.storage.rbd_utils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.712 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.731 225706 DEBUG nova.policy [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f459c4e71e6c47acb0f8aaf83f34695e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.769 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.770 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "379b2821245bc82aa5a95839eddb9a97716b559c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.772 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.772 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.802 225706 DEBUG nova.storage.rbd_utils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:22:23 np0005593295 nova_compute[225701]: 2026-01-23 10:22:23.804 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:22:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:24 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:24 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:24.264 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:22:24 np0005593295 nova_compute[225701]: 2026-01-23 10:22:24.264 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:24 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:24.266 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:22:24 np0005593295 nova_compute[225701]: 2026-01-23 10:22:24.457 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.652s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:22:24 np0005593295 nova_compute[225701]: 2026-01-23 10:22:24.528 225706 DEBUG nova.storage.rbd_utils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] resizing rbd image b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:22:24 np0005593295 nova_compute[225701]: 2026-01-23 10:22:24.567 225706 DEBUG nova.network.neutron [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Successfully created port: 06aeb511-67a6-4547-b061-9c4760285e3b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:22:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:24 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:24 np0005593295 nova_compute[225701]: 2026-01-23 10:22:24.633 225706 DEBUG nova.objects.instance [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'migration_context' on Instance uuid b8ea49c6-5f62-47b0-92cc-7399bfc98528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:22:24 np0005593295 nova_compute[225701]: 2026-01-23 10:22:24.652 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:22:24 np0005593295 nova_compute[225701]: 2026-01-23 10:22:24.652 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Ensure instance console log exists: /var/lib/nova/instances/b8ea49c6-5f62-47b0-92cc-7399bfc98528/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:22:24 np0005593295 nova_compute[225701]: 2026-01-23 10:22:24.653 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:24 np0005593295 nova_compute[225701]: 2026-01-23 10:22:24.653 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:24 np0005593295 nova_compute[225701]: 2026-01-23 10:22:24.654 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:24.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:24 np0005593295 nova_compute[225701]: 2026-01-23 10:22:24.984 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:22:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:25.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:22:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:25 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcddc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:25 np0005593295 nova_compute[225701]: 2026-01-23 10:22:25.498 225706 DEBUG nova.network.neutron [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Successfully updated port: 06aeb511-67a6-4547-b061-9c4760285e3b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:22:25 np0005593295 nova_compute[225701]: 2026-01-23 10:22:25.515 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-b8ea49c6-5f62-47b0-92cc-7399bfc98528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:22:25 np0005593295 nova_compute[225701]: 2026-01-23 10:22:25.516 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-b8ea49c6-5f62-47b0-92cc-7399bfc98528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:22:25 np0005593295 nova_compute[225701]: 2026-01-23 10:22:25.516 225706 DEBUG nova.network.neutron [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:22:25 np0005593295 nova_compute[225701]: 2026-01-23 10:22:25.616 225706 DEBUG nova.compute.manager [req-7aa7b5f5-8a1d-4b3b-a50f-15159e9e0526 req-d33d09b3-b3c7-4755-a54b-ffe8af031b12 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Received event network-changed-06aeb511-67a6-4547-b061-9c4760285e3b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:22:25 np0005593295 nova_compute[225701]: 2026-01-23 10:22:25.617 225706 DEBUG nova.compute.manager [req-7aa7b5f5-8a1d-4b3b-a50f-15159e9e0526 req-d33d09b3-b3c7-4755-a54b-ffe8af031b12 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Refreshing instance network info cache due to event network-changed-06aeb511-67a6-4547-b061-9c4760285e3b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:22:25 np0005593295 nova_compute[225701]: 2026-01-23 10:22:25.618 225706 DEBUG oslo_concurrency.lockutils [req-7aa7b5f5-8a1d-4b3b-a50f-15159e9e0526 req-d33d09b3-b3c7-4755-a54b-ffe8af031b12 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-b8ea49c6-5f62-47b0-92cc-7399bfc98528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:22:25 np0005593295 nova_compute[225701]: 2026-01-23 10:22:25.699 225706 DEBUG nova.network.neutron [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:22:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:26 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcddc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.558 225706 DEBUG nova.network.neutron [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Updating instance_info_cache with network_info: [{"id": "06aeb511-67a6-4547-b061-9c4760285e3b", "address": "fa:16:3e:39:69:9c", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06aeb511-67", "ovs_interfaceid": "06aeb511-67a6-4547-b061-9c4760285e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:22:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:26 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:26.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.734 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-b8ea49c6-5f62-47b0-92cc-7399bfc98528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.734 225706 DEBUG nova.compute.manager [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Instance network_info: |[{"id": "06aeb511-67a6-4547-b061-9c4760285e3b", "address": "fa:16:3e:39:69:9c", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06aeb511-67", "ovs_interfaceid": "06aeb511-67a6-4547-b061-9c4760285e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.735 225706 DEBUG oslo_concurrency.lockutils [req-7aa7b5f5-8a1d-4b3b-a50f-15159e9e0526 req-d33d09b3-b3c7-4755-a54b-ffe8af031b12 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-b8ea49c6-5f62-47b0-92cc-7399bfc98528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.735 225706 DEBUG nova.network.neutron [req-7aa7b5f5-8a1d-4b3b-a50f-15159e9e0526 req-d33d09b3-b3c7-4755-a54b-ffe8af031b12 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Refreshing network info cache for port 06aeb511-67a6-4547-b061-9c4760285e3b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.737 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Start _get_guest_xml network_info=[{"id": "06aeb511-67a6-4547-b061-9c4760285e3b", "address": "fa:16:3e:39:69:9c", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06aeb511-67", "ovs_interfaceid": "06aeb511-67a6-4547-b061-9c4760285e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_options': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '271ec98e-d058-421b-bbfb-4b4a5954c90a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.741 225706 WARNING nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.745 225706 DEBUG nova.virt.libvirt.host [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.745 225706 DEBUG nova.virt.libvirt.host [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.749 225706 DEBUG nova.virt.libvirt.host [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.749 225706 DEBUG nova.virt.libvirt.host [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.750 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.750 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T10:15:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1d8c8bf4-786e-4009-bc53-f259480fb5b3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.751 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.751 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.751 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.752 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.752 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.752 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.753 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.753 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.753 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.754 225706 DEBUG nova.virt.hardware [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:22:26 np0005593295 nova_compute[225701]: 2026-01-23 10:22:26.758 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:22:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:27.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:27 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 05:22:27 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/815013215' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.332 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.573s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.368 225706 DEBUG nova.storage.rbd_utils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.375 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:22:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:27 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:27 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 05:22:27 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2638325462' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.840 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.842 225706 DEBUG nova.virt.libvirt.vif [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:22:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1878761076',display_name='tempest-TestNetworkBasicOps-server-1878761076',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1878761076',id=7,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCnVJwsrFV8aI2nrJoEZl6VyyMwYmX81xfzmKsfGpDRm0DGXIQaGmDmPINRbdeF1kx8Y5VA3JSgU3fPoWzBbPsDeXm0p5hq8BrMWr1cPqMrGzO08egHCDlwB5XDUgBL1OA==',key_name='tempest-TestNetworkBasicOps-1503023412',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-qvac9ktg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:22:23Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=b8ea49c6-5f62-47b0-92cc-7399bfc98528,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06aeb511-67a6-4547-b061-9c4760285e3b", "address": "fa:16:3e:39:69:9c", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06aeb511-67", "ovs_interfaceid": "06aeb511-67a6-4547-b061-9c4760285e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.842 225706 DEBUG nova.network.os_vif_util [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "06aeb511-67a6-4547-b061-9c4760285e3b", "address": "fa:16:3e:39:69:9c", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06aeb511-67", "ovs_interfaceid": "06aeb511-67a6-4547-b061-9c4760285e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.843 225706 DEBUG nova.network.os_vif_util [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:69:9c,bridge_name='br-int',has_traffic_filtering=True,id=06aeb511-67a6-4547-b061-9c4760285e3b,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06aeb511-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.844 225706 DEBUG nova.objects.instance [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'pci_devices' on Instance uuid b8ea49c6-5f62-47b0-92cc-7399bfc98528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:22:27 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.967 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:22:27 np0005593295 nova_compute[225701]:  <uuid>b8ea49c6-5f62-47b0-92cc-7399bfc98528</uuid>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:  <name>instance-00000007</name>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:  <memory>131072</memory>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:  <vcpu>1</vcpu>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:  <metadata>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <nova:name>tempest-TestNetworkBasicOps-server-1878761076</nova:name>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <nova:creationTime>2026-01-23 10:22:26</nova:creationTime>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <nova:flavor name="m1.nano">
Jan 23 05:22:27 np0005593295 nova_compute[225701]:        <nova:memory>128</nova:memory>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:        <nova:disk>1</nova:disk>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:        <nova:swap>0</nova:swap>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      </nova:flavor>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <nova:owner>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:        <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:        <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      </nova:owner>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <nova:ports>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:        <nova:port uuid="06aeb511-67a6-4547-b061-9c4760285e3b">
Jan 23 05:22:27 np0005593295 nova_compute[225701]:          <nova:ip type="fixed" address="10.100.0.25" ipVersion="4"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:        </nova:port>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      </nova:ports>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    </nova:instance>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:  </metadata>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:  <sysinfo type="smbios">
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <system>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <entry name="serial">b8ea49c6-5f62-47b0-92cc-7399bfc98528</entry>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <entry name="uuid">b8ea49c6-5f62-47b0-92cc-7399bfc98528</entry>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    </system>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:  </sysinfo>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:  <os>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <boot dev="hd"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <smbios mode="sysinfo"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:  </os>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:  <features>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <acpi/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <apic/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <vmcoreinfo/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:  </features>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:  <clock offset="utc">
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <timer name="hpet" present="no"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:  </clock>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:  <cpu mode="host-model" match="exact">
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:  </cpu>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:  <devices>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <disk type="network" device="disk">
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <driver type="raw" cache="none"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <source protocol="rbd" name="vms/b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk">
Jan 23 05:22:27 np0005593295 nova_compute[225701]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      </source>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <auth username="openstack">
Jan 23 05:22:27 np0005593295 nova_compute[225701]:        <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      </auth>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <target dev="vda" bus="virtio"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    </disk>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <disk type="network" device="cdrom">
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <driver type="raw" cache="none"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <source protocol="rbd" name="vms/b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk.config">
Jan 23 05:22:27 np0005593295 nova_compute[225701]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      </source>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <auth username="openstack">
Jan 23 05:22:27 np0005593295 nova_compute[225701]:        <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      </auth>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <target dev="sda" bus="sata"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    </disk>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <interface type="ethernet">
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <mac address="fa:16:3e:39:69:9c"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <model type="virtio"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <mtu size="1442"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <target dev="tap06aeb511-67"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    </interface>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <serial type="pty">
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <log file="/var/lib/nova/instances/b8ea49c6-5f62-47b0-92cc-7399bfc98528/console.log" append="off"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    </serial>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <video>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <model type="virtio"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    </video>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <input type="tablet" bus="usb"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <rng model="virtio">
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    </rng>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <controller type="usb" index="0"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    <memballoon model="virtio">
Jan 23 05:22:27 np0005593295 nova_compute[225701]:      <stats period="10"/>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:    </memballoon>
Jan 23 05:22:27 np0005593295 nova_compute[225701]:  </devices>
Jan 23 05:22:27 np0005593295 nova_compute[225701]: </domain>
Jan 23 05:22:27 np0005593295 nova_compute[225701]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.968 225706 DEBUG nova.compute.manager [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Preparing to wait for external event network-vif-plugged-06aeb511-67a6-4547-b061-9c4760285e3b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.969 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.969 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.969 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.970 225706 DEBUG nova.virt.libvirt.vif [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:22:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1878761076',display_name='tempest-TestNetworkBasicOps-server-1878761076',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1878761076',id=7,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCnVJwsrFV8aI2nrJoEZl6VyyMwYmX81xfzmKsfGpDRm0DGXIQaGmDmPINRbdeF1kx8Y5VA3JSgU3fPoWzBbPsDeXm0p5hq8BrMWr1cPqMrGzO08egHCDlwB5XDUgBL1OA==',key_name='tempest-TestNetworkBasicOps-1503023412',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-qvac9ktg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:22:23Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=b8ea49c6-5f62-47b0-92cc-7399bfc98528,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06aeb511-67a6-4547-b061-9c4760285e3b", "address": "fa:16:3e:39:69:9c", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06aeb511-67", "ovs_interfaceid": "06aeb511-67a6-4547-b061-9c4760285e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.970 225706 DEBUG nova.network.os_vif_util [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "06aeb511-67a6-4547-b061-9c4760285e3b", "address": "fa:16:3e:39:69:9c", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06aeb511-67", "ovs_interfaceid": "06aeb511-67a6-4547-b061-9c4760285e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.971 225706 DEBUG nova.network.os_vif_util [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:69:9c,bridge_name='br-int',has_traffic_filtering=True,id=06aeb511-67a6-4547-b061-9c4760285e3b,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06aeb511-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.971 225706 DEBUG os_vif [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:69:9c,bridge_name='br-int',has_traffic_filtering=True,id=06aeb511-67a6-4547-b061-9c4760285e3b,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06aeb511-67') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.972 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.973 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.973 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.977 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.977 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06aeb511-67, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.978 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap06aeb511-67, col_values=(('external_ids', {'iface-id': '06aeb511-67a6-4547-b061-9c4760285e3b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:39:69:9c', 'vm-uuid': 'b8ea49c6-5f62-47b0-92cc-7399bfc98528'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.979 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:27 np0005593295 NetworkManager[48964]: <info>  [1769163747.9808] manager: (tap06aeb511-67): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.981 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.987 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:27 np0005593295 nova_compute[225701]: 2026-01-23 10:22:27.988 225706 INFO os_vif [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:69:9c,bridge_name='br-int',has_traffic_filtering=True,id=06aeb511-67a6-4547-b061-9c4760285e3b,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06aeb511-67')#033[00m
Jan 23 05:22:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:28 np0005593295 nova_compute[225701]: 2026-01-23 10:22:28.170 225706 DEBUG nova.network.neutron [req-7aa7b5f5-8a1d-4b3b-a50f-15159e9e0526 req-d33d09b3-b3c7-4755-a54b-ffe8af031b12 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Updated VIF entry in instance network info cache for port 06aeb511-67a6-4547-b061-9c4760285e3b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:22:28 np0005593295 nova_compute[225701]: 2026-01-23 10:22:28.172 225706 DEBUG nova.network.neutron [req-7aa7b5f5-8a1d-4b3b-a50f-15159e9e0526 req-d33d09b3-b3c7-4755-a54b-ffe8af031b12 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Updating instance_info_cache with network_info: [{"id": "06aeb511-67a6-4547-b061-9c4760285e3b", "address": "fa:16:3e:39:69:9c", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06aeb511-67", "ovs_interfaceid": "06aeb511-67a6-4547-b061-9c4760285e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:22:28 np0005593295 nova_compute[225701]: 2026-01-23 10:22:28.184 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:22:28 np0005593295 nova_compute[225701]: 2026-01-23 10:22:28.185 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:22:28 np0005593295 nova_compute[225701]: 2026-01-23 10:22:28.185 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:39:69:9c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:22:28 np0005593295 nova_compute[225701]: 2026-01-23 10:22:28.186 225706 INFO nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Using config drive#033[00m
Jan 23 05:22:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:28 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:28 np0005593295 nova_compute[225701]: 2026-01-23 10:22:28.214 225706 DEBUG nova.storage.rbd_utils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:22:28 np0005593295 nova_compute[225701]: 2026-01-23 10:22:28.221 225706 DEBUG oslo_concurrency.lockutils [req-7aa7b5f5-8a1d-4b3b-a50f-15159e9e0526 req-d33d09b3-b3c7-4755-a54b-ffe8af031b12 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-b8ea49c6-5f62-47b0-92cc-7399bfc98528" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:22:28 np0005593295 nova_compute[225701]: 2026-01-23 10:22:28.522 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:28 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce00008b50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:22:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:28.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:22:28 np0005593295 nova_compute[225701]: 2026-01-23 10:22:28.719 225706 INFO nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Creating config drive at /var/lib/nova/instances/b8ea49c6-5f62-47b0-92cc-7399bfc98528/disk.config#033[00m
Jan 23 05:22:28 np0005593295 nova_compute[225701]: 2026-01-23 10:22:28.726 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b8ea49c6-5f62-47b0-92cc-7399bfc98528/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwxdhw6_t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:22:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:28 np0005593295 nova_compute[225701]: 2026-01-23 10:22:28.853 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b8ea49c6-5f62-47b0-92cc-7399bfc98528/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwxdhw6_t" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:22:28 np0005593295 nova_compute[225701]: 2026-01-23 10:22:28.884 225706 DEBUG nova.storage.rbd_utils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:22:28 np0005593295 nova_compute[225701]: 2026-01-23 10:22:28.888 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b8ea49c6-5f62-47b0-92cc-7399bfc98528/disk.config b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:22:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:29 np0005593295 nova_compute[225701]: 2026-01-23 10:22:29.076 225706 DEBUG oslo_concurrency.processutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b8ea49c6-5f62-47b0-92cc-7399bfc98528/disk.config b8ea49c6-5f62-47b0-92cc-7399bfc98528_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.188s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:22:29 np0005593295 nova_compute[225701]: 2026-01-23 10:22:29.077 225706 INFO nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Deleting local config drive /var/lib/nova/instances/b8ea49c6-5f62-47b0-92cc-7399bfc98528/disk.config because it was imported into RBD.#033[00m
Jan 23 05:22:29 np0005593295 systemd[1]: Starting libvirt secret daemon...
Jan 23 05:22:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:29.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:29 np0005593295 systemd[1]: Started libvirt secret daemon.
Jan 23 05:22:29 np0005593295 kernel: tap06aeb511-67: entered promiscuous mode
Jan 23 05:22:29 np0005593295 NetworkManager[48964]: <info>  [1769163749.1719] manager: (tap06aeb511-67): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Jan 23 05:22:29 np0005593295 ovn_controller[132789]: 2026-01-23T10:22:29Z|00037|binding|INFO|Claiming lport 06aeb511-67a6-4547-b061-9c4760285e3b for this chassis.
Jan 23 05:22:29 np0005593295 ovn_controller[132789]: 2026-01-23T10:22:29Z|00038|binding|INFO|06aeb511-67a6-4547-b061-9c4760285e3b: Claiming fa:16:3e:39:69:9c 10.100.0.25
Jan 23 05:22:29 np0005593295 nova_compute[225701]: 2026-01-23 10:22:29.174 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:29 np0005593295 nova_compute[225701]: 2026-01-23 10:22:29.177 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.185 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:69:9c 10.100.0.25'], port_security=['fa:16:3e:39:69:9c 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': 'b8ea49c6-5f62-47b0-92cc-7399bfc98528', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a09a282-aa22-47cf-a68d-ce0dba493868', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a9d12c65-6e30-4f8d-be47-424dc8b73a1d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8dee54ab-ce3c-4b4e-ac76-15d1824a947d, chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], logical_port=06aeb511-67a6-4547-b061-9c4760285e3b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.186 142606 INFO neutron.agent.ovn.metadata.agent [-] Port 06aeb511-67a6-4547-b061-9c4760285e3b in datapath 6a09a282-aa22-47cf-a68d-ce0dba493868 bound to our chassis#033[00m
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.188 142606 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a09a282-aa22-47cf-a68d-ce0dba493868#033[00m
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.206 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[bbc3b7a0-db6b-4fe6-92d5-424931dfaba9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.207 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6a09a282-a1 in ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:22:29 np0005593295 systemd-machined[194368]: New machine qemu-2-instance-00000007.
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.209 229823 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6a09a282-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.210 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[994d1fd9-c473-4a6c-ad2b-48da4d520f6c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.210 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[ed323c88-2687-4baf-a761-07c3614dbd89]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:29 np0005593295 nova_compute[225701]: 2026-01-23 10:22:29.216 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:29 np0005593295 ovn_controller[132789]: 2026-01-23T10:22:29Z|00039|binding|INFO|Setting lport 06aeb511-67a6-4547-b061-9c4760285e3b ovn-installed in OVS
Jan 23 05:22:29 np0005593295 ovn_controller[132789]: 2026-01-23T10:22:29Z|00040|binding|INFO|Setting lport 06aeb511-67a6-4547-b061-9c4760285e3b up in Southbound
Jan 23 05:22:29 np0005593295 nova_compute[225701]: 2026-01-23 10:22:29.221 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:29 np0005593295 systemd[1]: Started Virtual Machine qemu-2-instance-00000007.
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.227 142723 DEBUG oslo.privsep.daemon [-] privsep: reply[6ed09ec5-7143-4b01-939e-6bb989ee2ba1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:29 np0005593295 systemd-udevd[232463]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:22:29 np0005593295 NetworkManager[48964]: <info>  [1769163749.2417] device (tap06aeb511-67): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:22:29 np0005593295 NetworkManager[48964]: <info>  [1769163749.2423] device (tap06aeb511-67): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.243 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[b584c21b-2138-4546-9e74-7596049bc21b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.274 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[dba45c58-3d96-400c-b746-5e7bf5dfc3f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.279 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[69446257-bf41-4ff6-bbdd-cf21151daa1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:29 np0005593295 NetworkManager[48964]: <info>  [1769163749.2816] manager: (tap6a09a282-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.312 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[0430c7e1-bfd7-4d2f-9f6a-5eea8b979138]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.315 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[7a4dbbee-3d58-44c6-8b6d-27efdc5a8ddd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:29 np0005593295 NetworkManager[48964]: <info>  [1769163749.3356] device (tap6a09a282-a0): carrier: link connected
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.341 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[1b7dfbf1-4b31-4fe0-ab4c-6a20e7611c17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.356 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[d99023eb-843f-4a2a-9514-3ad3d757f417]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a09a282-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e9:9b:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490198, 'reachable_time': 30952, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232494, 'error': None, 'target': 'ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.370 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[0a29cf73-efb1-45dc-a294-7ba77a7759e2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee9:9ba3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 490198, 'tstamp': 490198}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232495, 'error': None, 'target': 'ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.384 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[1bde2b59-c9f7-4ff3-9a7f-3dba601ca08e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a09a282-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e9:9b:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490198, 'reachable_time': 30952, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232496, 'error': None, 'target': 'ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.414 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[27b79cc3-53c6-43be-8328-fdd8b99ed94e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:29 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.468 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[b597dd07-4a0f-4698-95b7-241b1d3ab055]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.469 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a09a282-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.469 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.470 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a09a282-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:29 np0005593295 kernel: tap6a09a282-a0: entered promiscuous mode
Jan 23 05:22:29 np0005593295 NetworkManager[48964]: <info>  [1769163749.4724] manager: (tap6a09a282-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Jan 23 05:22:29 np0005593295 nova_compute[225701]: 2026-01-23 10:22:29.471 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:29 np0005593295 nova_compute[225701]: 2026-01-23 10:22:29.473 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.474 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a09a282-a0, col_values=(('external_ids', {'iface-id': 'f3eaa8c6-94ad-445d-ab48-59e26f30c078'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:29 np0005593295 nova_compute[225701]: 2026-01-23 10:22:29.475 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:29 np0005593295 ovn_controller[132789]: 2026-01-23T10:22:29Z|00041|binding|INFO|Releasing lport f3eaa8c6-94ad-445d-ab48-59e26f30c078 from this chassis (sb_readonly=0)
Jan 23 05:22:29 np0005593295 nova_compute[225701]: 2026-01-23 10:22:29.493 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:29 np0005593295 nova_compute[225701]: 2026-01-23 10:22:29.494 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.494 142606 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6a09a282-aa22-47cf-a68d-ce0dba493868.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6a09a282-aa22-47cf-a68d-ce0dba493868.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.495 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[94d333b0-67ba-41f4-a5ed-750bfb3471e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.496 142606 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: global
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]:    log         /dev/log local0 debug
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]:    log-tag     haproxy-metadata-proxy-6a09a282-aa22-47cf-a68d-ce0dba493868
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]:    user        root
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]:    group       root
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]:    maxconn     1024
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]:    pidfile     /var/lib/neutron/external/pids/6a09a282-aa22-47cf-a68d-ce0dba493868.pid.haproxy
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]:    daemon
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: defaults
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]:    log global
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]:    mode http
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]:    option httplog
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]:    option dontlognull
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]:    option http-server-close
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]:    option forwardfor
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]:    retries                 3
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]:    timeout http-request    30s
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]:    timeout connect         30s
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]:    timeout client          32s
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]:    timeout server          32s
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]:    timeout http-keep-alive 30s
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: listen listener
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]:    bind 169.254.169.254:80
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]:    http-request add-header X-OVN-Network-ID 6a09a282-aa22-47cf-a68d-ce0dba493868
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:22:29 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:29.497 142606 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868', 'env', 'PROCESS_TAG=haproxy-6a09a282-aa22-47cf-a68d-ce0dba493868', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6a09a282-aa22-47cf-a68d-ce0dba493868.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:22:29 np0005593295 nova_compute[225701]: 2026-01-23 10:22:29.516 225706 DEBUG nova.compute.manager [req-36c50f77-7ba3-4d8b-ada3-6c219e2293cb req-b0504862-8e59-4ddb-bde8-7de9421b3f62 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Received event network-vif-plugged-06aeb511-67a6-4547-b061-9c4760285e3b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:22:29 np0005593295 nova_compute[225701]: 2026-01-23 10:22:29.516 225706 DEBUG oslo_concurrency.lockutils [req-36c50f77-7ba3-4d8b-ada3-6c219e2293cb req-b0504862-8e59-4ddb-bde8-7de9421b3f62 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:29 np0005593295 nova_compute[225701]: 2026-01-23 10:22:29.517 225706 DEBUG oslo_concurrency.lockutils [req-36c50f77-7ba3-4d8b-ada3-6c219e2293cb req-b0504862-8e59-4ddb-bde8-7de9421b3f62 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:29 np0005593295 nova_compute[225701]: 2026-01-23 10:22:29.517 225706 DEBUG oslo_concurrency.lockutils [req-36c50f77-7ba3-4d8b-ada3-6c219e2293cb req-b0504862-8e59-4ddb-bde8-7de9421b3f62 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:29 np0005593295 nova_compute[225701]: 2026-01-23 10:22:29.517 225706 DEBUG nova.compute.manager [req-36c50f77-7ba3-4d8b-ada3-6c219e2293cb req-b0504862-8e59-4ddb-bde8-7de9421b3f62 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Processing event network-vif-plugged-06aeb511-67a6-4547-b061-9c4760285e3b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:22:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:29 np0005593295 podman[232526]: 2026-01-23 10:22:29.850597953 +0000 UTC m=+0.046128653 container create dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:22:29 np0005593295 systemd[1]: Started libpod-conmon-dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a.scope.
Jan 23 05:22:29 np0005593295 systemd[1]: Started libcrun container.
Jan 23 05:22:29 np0005593295 podman[232526]: 2026-01-23 10:22:29.826652034 +0000 UTC m=+0.022182754 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 05:22:29 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47e998d09fd3fbb881a05bc419ded091790e7655197f061ed433d1f7bd8e2ac3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:22:29 np0005593295 podman[232526]: 2026-01-23 10:22:29.942775995 +0000 UTC m=+0.138306715 container init dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:22:29 np0005593295 podman[232526]: 2026-01-23 10:22:29.948125213 +0000 UTC m=+0.143655913 container start dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 23 05:22:29 np0005593295 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[232542]: [NOTICE]   (232562) : New worker (232566) forked
Jan 23 05:22:29 np0005593295 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[232542]: [NOTICE]   (232562) : Loading success.
Jan 23 05:22:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.137 225706 DEBUG nova.compute.manager [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.138 225706 DEBUG nova.virt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Emitting event <LifecycleEvent: 1769163750.1371768, b8ea49c6-5f62-47b0-92cc-7399bfc98528 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.138 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] VM Started (Lifecycle Event)#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.141 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.144 225706 INFO nova.virt.libvirt.driver [-] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Instance spawned successfully.#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.144 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.161 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.167 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.171 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.171 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.172 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.172 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.173 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.173 225706 DEBUG nova.virt.libvirt.driver [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.202 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.203 225706 DEBUG nova.virt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Emitting event <LifecycleEvent: 1769163750.1374104, b8ea49c6-5f62-47b0-92cc-7399bfc98528 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.203 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:22:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:30 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.225 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.228 225706 DEBUG nova.virt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Emitting event <LifecycleEvent: 1769163750.1405203, b8ea49c6-5f62-47b0-92cc-7399bfc98528 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.228 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.239 225706 INFO nova.compute.manager [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Took 6.61 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.239 225706 DEBUG nova.compute.manager [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.250 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.253 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.283 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.303 225706 INFO nova.compute.manager [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Took 7.60 seconds to build instance.#033[00m
Jan 23 05:22:30 np0005593295 nova_compute[225701]: 2026-01-23 10:22:30.325 225706 DEBUG oslo_concurrency.lockutils [None req-6c836ff0-3c27-4807-abd2-ad760701bec0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.707s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:30 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcddc002720 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:30.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:31.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:31 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce00009470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:31 np0005593295 nova_compute[225701]: 2026-01-23 10:22:31.631 225706 DEBUG nova.compute.manager [req-80ad4137-f60b-490f-ba41-a2a7113a12a6 req-98ece8f4-a4e2-4d5a-a88d-fc93a449e5b5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Received event network-vif-plugged-06aeb511-67a6-4547-b061-9c4760285e3b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:22:31 np0005593295 nova_compute[225701]: 2026-01-23 10:22:31.632 225706 DEBUG oslo_concurrency.lockutils [req-80ad4137-f60b-490f-ba41-a2a7113a12a6 req-98ece8f4-a4e2-4d5a-a88d-fc93a449e5b5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:31 np0005593295 nova_compute[225701]: 2026-01-23 10:22:31.633 225706 DEBUG oslo_concurrency.lockutils [req-80ad4137-f60b-490f-ba41-a2a7113a12a6 req-98ece8f4-a4e2-4d5a-a88d-fc93a449e5b5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:31 np0005593295 nova_compute[225701]: 2026-01-23 10:22:31.633 225706 DEBUG oslo_concurrency.lockutils [req-80ad4137-f60b-490f-ba41-a2a7113a12a6 req-98ece8f4-a4e2-4d5a-a88d-fc93a449e5b5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:31 np0005593295 nova_compute[225701]: 2026-01-23 10:22:31.633 225706 DEBUG nova.compute.manager [req-80ad4137-f60b-490f-ba41-a2a7113a12a6 req-98ece8f4-a4e2-4d5a-a88d-fc93a449e5b5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] No waiting events found dispatching network-vif-plugged-06aeb511-67a6-4547-b061-9c4760285e3b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:22:31 np0005593295 nova_compute[225701]: 2026-01-23 10:22:31.633 225706 WARNING nova.compute.manager [req-80ad4137-f60b-490f-ba41-a2a7113a12a6 req-98ece8f4-a4e2-4d5a-a88d-fc93a449e5b5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Received unexpected event network-vif-plugged-06aeb511-67a6-4547-b061-9c4760285e3b for instance with vm_state active and task_state None.#033[00m
Jan 23 05:22:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:32 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde4002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:32 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:32.270 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:32 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:32.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:32 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:32 np0005593295 nova_compute[225701]: 2026-01-23 10:22:32.980 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:33.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:33 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcddc002720 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:33 np0005593295 nova_compute[225701]: 2026-01-23 10:22:33.524 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:34 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce00009470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:34 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde4002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:34.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:22:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:35.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:22:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:35 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:36 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcddc003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:36 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce0000a180 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:36.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:37.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:37 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:37 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:37 np0005593295 nova_compute[225701]: 2026-01-23 10:22:37.983 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:38 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:38 np0005593295 nova_compute[225701]: 2026-01-23 10:22:38.525 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:38 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcddc003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:38.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:39.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:39 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce0000a180 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:40 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:40 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:22:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:40.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:22:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:22:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:41.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:22:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:41 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcddc003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:42 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce0000a180 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:42 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:42.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:42 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:42 np0005593295 nova_compute[225701]: 2026-01-23 10:22:42.985 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:43.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:43 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcde4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:22:43 np0005593295 nova_compute[225701]: 2026-01-23 10:22:43.528 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:43 np0005593295 ovn_controller[132789]: 2026-01-23T10:22:43Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:39:69:9c 10.100.0.25
Jan 23 05:22:43 np0005593295 ovn_controller[132789]: 2026-01-23T10:22:43Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:39:69:9c 10.100.0.25
Jan 23 05:22:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:44 np0005593295 kernel: ganesha.nfsd[232028]: segfault at 50 ip 00007fce8c07632e sp 00007fce127fb210 error 4 in libntirpc.so.5.8[7fce8c05b000+2c000] likely on CPU 1 (core 0, socket 1)
Jan 23 05:22:44 np0005593295 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 05:22:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[231901]: 23/01/2026 10:22:44 : epoch 69734bcc : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fcdf8003cc0 fd 39 proxy ignored for local
Jan 23 05:22:44 np0005593295 systemd[1]: Started Process Core Dump (PID 232614/UID 0).
Jan 23 05:22:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:22:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:44.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:22:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:45.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:45 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:22:45 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:22:45 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:22:45 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:22:45 np0005593295 systemd-coredump[232615]: Process 231905 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 43:#012#0  0x00007fce8c07632e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 23 05:22:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:46 np0005593295 systemd[1]: systemd-coredump@12-232614-0.service: Deactivated successfully.
Jan 23 05:22:46 np0005593295 systemd[1]: systemd-coredump@12-232614-0.service: Consumed 1.770s CPU time.
Jan 23 05:22:46 np0005593295 podman[232729]: 2026-01-23 10:22:46.135171732 +0000 UTC m=+0.026270122 container died f37b9193829c226bd3c386457514070a84ef21e291b2b2ba15bdbe8360f58f39 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 05:22:46 np0005593295 systemd[1]: var-lib-containers-storage-overlay-b999db203518d32001aadbecd54d65860121f1de2ce596582814e75cf37a5783-merged.mount: Deactivated successfully.
Jan 23 05:22:46 np0005593295 podman[232729]: 2026-01-23 10:22:46.176343276 +0000 UTC m=+0.067441656 container remove f37b9193829c226bd3c386457514070a84ef21e291b2b2ba15bdbe8360f58f39 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:22:46 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 05:22:46 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 05:22:46 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.533s CPU time.
Jan 23 05:22:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:22:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:46.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:22:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:22:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:47.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:22:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:47 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:47 np0005593295 nova_compute[225701]: 2026-01-23 10:22:47.988 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:48 np0005593295 nova_compute[225701]: 2026-01-23 10:22:48.529 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:48 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 23 05:22:48 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2595657408' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:22:48 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 23 05:22:48 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2595657408' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:22:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:48.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:49.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:49 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:22:49 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:22:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/102250 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:22:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:22:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:50.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:22:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:22:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:51.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:22:51 np0005593295 podman[232803]: 2026-01-23 10:22:51.635823462 +0000 UTC m=+0.054810959 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 23 05:22:51 np0005593295 podman[232802]: 2026-01-23 10:22:51.665604519 +0000 UTC m=+0.092184691 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:22:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:22:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:52.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:22:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:52 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:52 np0005593295 nova_compute[225701]: 2026-01-23 10:22:52.990 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:53.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:53 np0005593295 nova_compute[225701]: 2026-01-23 10:22:53.532 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:54.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:22:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:55.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:22:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:55.489 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:55.490 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:55.490 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:56 np0005593295 nova_compute[225701]: 2026-01-23 10:22:56.401 225706 DEBUG oslo_concurrency.lockutils [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:56 np0005593295 nova_compute[225701]: 2026-01-23 10:22:56.402 225706 DEBUG oslo_concurrency.lockutils [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:56 np0005593295 nova_compute[225701]: 2026-01-23 10:22:56.402 225706 DEBUG oslo_concurrency.lockutils [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:56 np0005593295 nova_compute[225701]: 2026-01-23 10:22:56.402 225706 DEBUG oslo_concurrency.lockutils [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:56 np0005593295 nova_compute[225701]: 2026-01-23 10:22:56.403 225706 DEBUG oslo_concurrency.lockutils [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:56 np0005593295 nova_compute[225701]: 2026-01-23 10:22:56.405 225706 INFO nova.compute.manager [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Terminating instance#033[00m
Jan 23 05:22:56 np0005593295 nova_compute[225701]: 2026-01-23 10:22:56.406 225706 DEBUG nova.compute.manager [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:22:56 np0005593295 kernel: tap06aeb511-67 (unregistering): left promiscuous mode
Jan 23 05:22:56 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 13.
Jan 23 05:22:56 np0005593295 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:22:56 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.533s CPU time.
Jan 23 05:22:56 np0005593295 NetworkManager[48964]: <info>  [1769163776.4793] device (tap06aeb511-67): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:22:56 np0005593295 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 05:22:56 np0005593295 ovn_controller[132789]: 2026-01-23T10:22:56Z|00042|binding|INFO|Releasing lport 06aeb511-67a6-4547-b061-9c4760285e3b from this chassis (sb_readonly=0)
Jan 23 05:22:56 np0005593295 nova_compute[225701]: 2026-01-23 10:22:56.493 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:56 np0005593295 ovn_controller[132789]: 2026-01-23T10:22:56Z|00043|binding|INFO|Setting lport 06aeb511-67a6-4547-b061-9c4760285e3b down in Southbound
Jan 23 05:22:56 np0005593295 ovn_controller[132789]: 2026-01-23T10:22:56Z|00044|binding|INFO|Removing iface tap06aeb511-67 ovn-installed in OVS
Jan 23 05:22:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.500 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:69:9c 10.100.0.25'], port_security=['fa:16:3e:39:69:9c 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': 'b8ea49c6-5f62-47b0-92cc-7399bfc98528', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a09a282-aa22-47cf-a68d-ce0dba493868', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a9d12c65-6e30-4f8d-be47-424dc8b73a1d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8dee54ab-ce3c-4b4e-ac76-15d1824a947d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], logical_port=06aeb511-67a6-4547-b061-9c4760285e3b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:22:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.501 142606 INFO neutron.agent.ovn.metadata.agent [-] Port 06aeb511-67a6-4547-b061-9c4760285e3b in datapath 6a09a282-aa22-47cf-a68d-ce0dba493868 unbound from our chassis#033[00m
Jan 23 05:22:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.502 142606 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6a09a282-aa22-47cf-a68d-ce0dba493868, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:22:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.503 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[f77d59dd-e119-4549-928d-e5600508c9a7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.504 142606 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868 namespace which is not needed anymore#033[00m
Jan 23 05:22:56 np0005593295 nova_compute[225701]: 2026-01-23 10:22:56.511 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:56 np0005593295 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000007.scope: Deactivated successfully.
Jan 23 05:22:56 np0005593295 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000007.scope: Consumed 13.750s CPU time.
Jan 23 05:22:56 np0005593295 systemd-machined[194368]: Machine qemu-2-instance-00000007 terminated.
Jan 23 05:22:56 np0005593295 nova_compute[225701]: 2026-01-23 10:22:56.657 225706 INFO nova.virt.libvirt.driver [-] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Instance destroyed successfully.#033[00m
Jan 23 05:22:56 np0005593295 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[232542]: [NOTICE]   (232562) : haproxy version is 2.8.14-c23fe91
Jan 23 05:22:56 np0005593295 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[232542]: [NOTICE]   (232562) : path to executable is /usr/sbin/haproxy
Jan 23 05:22:56 np0005593295 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[232542]: [WARNING]  (232562) : Exiting Master process...
Jan 23 05:22:56 np0005593295 nova_compute[225701]: 2026-01-23 10:22:56.658 225706 DEBUG nova.objects.instance [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'resources' on Instance uuid b8ea49c6-5f62-47b0-92cc-7399bfc98528 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:22:56 np0005593295 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[232542]: [WARNING]  (232562) : Exiting Master process...
Jan 23 05:22:56 np0005593295 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[232542]: [ALERT]    (232562) : Current worker (232566) exited with code 143 (Terminated)
Jan 23 05:22:56 np0005593295 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[232542]: [WARNING]  (232562) : All workers exited. Exiting... (0)
Jan 23 05:22:56 np0005593295 systemd[1]: libpod-dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a.scope: Deactivated successfully.
Jan 23 05:22:56 np0005593295 podman[232897]: 2026-01-23 10:22:56.670533778 +0000 UTC m=+0.059751419 container died dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:22:56 np0005593295 nova_compute[225701]: 2026-01-23 10:22:56.674 225706 DEBUG nova.virt.libvirt.vif [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:22:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1878761076',display_name='tempest-TestNetworkBasicOps-server-1878761076',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1878761076',id=7,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCnVJwsrFV8aI2nrJoEZl6VyyMwYmX81xfzmKsfGpDRm0DGXIQaGmDmPINRbdeF1kx8Y5VA3JSgU3fPoWzBbPsDeXm0p5hq8BrMWr1cPqMrGzO08egHCDlwB5XDUgBL1OA==',key_name='tempest-TestNetworkBasicOps-1503023412',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:22:30Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-qvac9ktg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:22:30Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=b8ea49c6-5f62-47b0-92cc-7399bfc98528,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06aeb511-67a6-4547-b061-9c4760285e3b", "address": "fa:16:3e:39:69:9c", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06aeb511-67", "ovs_interfaceid": "06aeb511-67a6-4547-b061-9c4760285e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:22:56 np0005593295 nova_compute[225701]: 2026-01-23 10:22:56.675 225706 DEBUG nova.network.os_vif_util [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "06aeb511-67a6-4547-b061-9c4760285e3b", "address": "fa:16:3e:39:69:9c", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06aeb511-67", "ovs_interfaceid": "06aeb511-67a6-4547-b061-9c4760285e3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:22:56 np0005593295 nova_compute[225701]: 2026-01-23 10:22:56.676 225706 DEBUG nova.network.os_vif_util [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:69:9c,bridge_name='br-int',has_traffic_filtering=True,id=06aeb511-67a6-4547-b061-9c4760285e3b,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06aeb511-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:22:56 np0005593295 nova_compute[225701]: 2026-01-23 10:22:56.676 225706 DEBUG os_vif [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:69:9c,bridge_name='br-int',has_traffic_filtering=True,id=06aeb511-67a6-4547-b061-9c4760285e3b,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06aeb511-67') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:22:56 np0005593295 nova_compute[225701]: 2026-01-23 10:22:56.680 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:56 np0005593295 nova_compute[225701]: 2026-01-23 10:22:56.681 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06aeb511-67, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:56 np0005593295 nova_compute[225701]: 2026-01-23 10:22:56.685 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:56 np0005593295 nova_compute[225701]: 2026-01-23 10:22:56.687 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:22:56 np0005593295 nova_compute[225701]: 2026-01-23 10:22:56.692 225706 INFO os_vif [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:69:9c,bridge_name='br-int',has_traffic_filtering=True,id=06aeb511-67a6-4547-b061-9c4760285e3b,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06aeb511-67')#033[00m
Jan 23 05:22:56 np0005593295 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a-userdata-shm.mount: Deactivated successfully.
Jan 23 05:22:56 np0005593295 systemd[1]: var-lib-containers-storage-overlay-47e998d09fd3fbb881a05bc419ded091790e7655197f061ed433d1f7bd8e2ac3-merged.mount: Deactivated successfully.
Jan 23 05:22:56 np0005593295 podman[232897]: 2026-01-23 10:22:56.716636803 +0000 UTC m=+0.105854434 container cleanup dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 05:22:56 np0005593295 systemd[1]: libpod-conmon-dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a.scope: Deactivated successfully.
Jan 23 05:22:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:56.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:56 np0005593295 podman[232964]: 2026-01-23 10:22:56.758494915 +0000 UTC m=+0.040291675 container create 00f7620c8686931b35559136526f1ddfd77324ee269c01ffbfe8698ca081684a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Jan 23 05:22:56 np0005593295 podman[232980]: 2026-01-23 10:22:56.793148711 +0000 UTC m=+0.050979666 container remove dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 05:22:56 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c9bf5cbdb826cd487d0a518ff7649bfcd11428b56788cb34e05d1a88f76de1b/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 05:22:56 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c9bf5cbdb826cd487d0a518ff7649bfcd11428b56788cb34e05d1a88f76de1b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 05:22:56 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c9bf5cbdb826cd487d0a518ff7649bfcd11428b56788cb34e05d1a88f76de1b/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:22:56 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c9bf5cbdb826cd487d0a518ff7649bfcd11428b56788cb34e05d1a88f76de1b/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.tykohi-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 05:22:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.798 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[bc79a4f3-fca8-4b96-9cfc-b3808de9e1bf]: (4, ('Fri Jan 23 10:22:56 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868 (dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a)\ndc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a\nFri Jan 23 10:22:56 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868 (dc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a)\ndc94e18b0361207bf24082429588711e70507727f525be660ea0472b79ee150a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.800 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[3cafc1bb-53cf-4ffc-8d0d-9a636a084a3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.801 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a09a282-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:56 np0005593295 nova_compute[225701]: 2026-01-23 10:22:56.802 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:56 np0005593295 kernel: tap6a09a282-a0: left promiscuous mode
Jan 23 05:22:56 np0005593295 podman[232964]: 2026-01-23 10:22:56.812439482 +0000 UTC m=+0.094236272 container init 00f7620c8686931b35559136526f1ddfd77324ee269c01ffbfe8698ca081684a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 05:22:56 np0005593295 nova_compute[225701]: 2026-01-23 10:22:56.820 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:56 np0005593295 podman[232964]: 2026-01-23 10:22:56.82305101 +0000 UTC m=+0.104847770 container start 00f7620c8686931b35559136526f1ddfd77324ee269c01ffbfe8698ca081684a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_REF=squid)
Jan 23 05:22:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.823 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[ef75d7a4-3a7d-4505-9c6a-64fad7bf6b50]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:56 np0005593295 bash[232964]: 00f7620c8686931b35559136526f1ddfd77324ee269c01ffbfe8698ca081684a
Jan 23 05:22:56 np0005593295 podman[232964]: 2026-01-23 10:22:56.742452364 +0000 UTC m=+0.024249134 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 05:22:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:22:56 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 05:22:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:22:56 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 05:22:56 np0005593295 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:22:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.839 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[d5b3b907-b788-4c09-b943-2d4b7c040085]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.840 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[fd7e422f-64fc-4397-9419-d36eaf343fc8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.852 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[a90f14d8-e5f8-45d2-8953-8c22b7ada38b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490191, 'reachable_time': 26342, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233006, 'error': None, 'target': 'ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.855 142723 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:22:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:22:56.855 142723 DEBUG oslo.privsep.daemon [-] privsep: reply[8c1d1968-0a23-47a7-8032-6bff95743003]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:56 np0005593295 systemd[1]: run-netns-ovnmeta\x2d6a09a282\x2daa22\x2d47cf\x2da68d\x2dce0dba493868.mount: Deactivated successfully.
Jan 23 05:22:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:22:56 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 05:22:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:22:56 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 05:22:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:22:56 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 05:22:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:22:56 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 05:22:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:22:56 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 05:22:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:22:56 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:22:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:22:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:57.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:22:57 np0005593295 nova_compute[225701]: 2026-01-23 10:22:57.314 225706 INFO nova.virt.libvirt.driver [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Deleting instance files /var/lib/nova/instances/b8ea49c6-5f62-47b0-92cc-7399bfc98528_del#033[00m
Jan 23 05:22:57 np0005593295 nova_compute[225701]: 2026-01-23 10:22:57.315 225706 INFO nova.virt.libvirt.driver [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Deletion of /var/lib/nova/instances/b8ea49c6-5f62-47b0-92cc-7399bfc98528_del complete#033[00m
Jan 23 05:22:57 np0005593295 nova_compute[225701]: 2026-01-23 10:22:57.396 225706 DEBUG nova.compute.manager [req-08105b5e-3660-4f28-b52d-cc77f85a3848 req-7adeba5c-36c5-4aad-b4a4-8254cec91f7e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Received event network-vif-unplugged-06aeb511-67a6-4547-b061-9c4760285e3b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:22:57 np0005593295 nova_compute[225701]: 2026-01-23 10:22:57.396 225706 DEBUG oslo_concurrency.lockutils [req-08105b5e-3660-4f28-b52d-cc77f85a3848 req-7adeba5c-36c5-4aad-b4a4-8254cec91f7e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:57 np0005593295 nova_compute[225701]: 2026-01-23 10:22:57.397 225706 DEBUG oslo_concurrency.lockutils [req-08105b5e-3660-4f28-b52d-cc77f85a3848 req-7adeba5c-36c5-4aad-b4a4-8254cec91f7e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:57 np0005593295 nova_compute[225701]: 2026-01-23 10:22:57.398 225706 DEBUG oslo_concurrency.lockutils [req-08105b5e-3660-4f28-b52d-cc77f85a3848 req-7adeba5c-36c5-4aad-b4a4-8254cec91f7e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:57 np0005593295 nova_compute[225701]: 2026-01-23 10:22:57.398 225706 DEBUG nova.compute.manager [req-08105b5e-3660-4f28-b52d-cc77f85a3848 req-7adeba5c-36c5-4aad-b4a4-8254cec91f7e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] No waiting events found dispatching network-vif-unplugged-06aeb511-67a6-4547-b061-9c4760285e3b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:22:57 np0005593295 nova_compute[225701]: 2026-01-23 10:22:57.399 225706 DEBUG nova.compute.manager [req-08105b5e-3660-4f28-b52d-cc77f85a3848 req-7adeba5c-36c5-4aad-b4a4-8254cec91f7e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Received event network-vif-unplugged-06aeb511-67a6-4547-b061-9c4760285e3b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:22:57 np0005593295 nova_compute[225701]: 2026-01-23 10:22:57.400 225706 INFO nova.compute.manager [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Took 0.99 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:22:57 np0005593295 nova_compute[225701]: 2026-01-23 10:22:57.401 225706 DEBUG oslo.service.loopingcall [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:22:57 np0005593295 nova_compute[225701]: 2026-01-23 10:22:57.402 225706 DEBUG nova.compute.manager [-] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:22:57 np0005593295 nova_compute[225701]: 2026-01-23 10:22:57.402 225706 DEBUG nova.network.neutron [-] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:22:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:57 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:58 np0005593295 nova_compute[225701]: 2026-01-23 10:22:58.534 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:58.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:22:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:22:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:22:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:59.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:59 np0005593295 nova_compute[225701]: 2026-01-23 10:22:59.481 225706 DEBUG nova.compute.manager [req-fb65a6f4-dc43-4684-9c29-9f866fbc0387 req-fe21e270-5e8d-4008-887e-8601acc93f1c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Received event network-vif-plugged-06aeb511-67a6-4547-b061-9c4760285e3b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:22:59 np0005593295 nova_compute[225701]: 2026-01-23 10:22:59.481 225706 DEBUG oslo_concurrency.lockutils [req-fb65a6f4-dc43-4684-9c29-9f866fbc0387 req-fe21e270-5e8d-4008-887e-8601acc93f1c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:59 np0005593295 nova_compute[225701]: 2026-01-23 10:22:59.482 225706 DEBUG oslo_concurrency.lockutils [req-fb65a6f4-dc43-4684-9c29-9f866fbc0387 req-fe21e270-5e8d-4008-887e-8601acc93f1c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:59 np0005593295 nova_compute[225701]: 2026-01-23 10:22:59.482 225706 DEBUG oslo_concurrency.lockutils [req-fb65a6f4-dc43-4684-9c29-9f866fbc0387 req-fe21e270-5e8d-4008-887e-8601acc93f1c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:59 np0005593295 nova_compute[225701]: 2026-01-23 10:22:59.482 225706 DEBUG nova.compute.manager [req-fb65a6f4-dc43-4684-9c29-9f866fbc0387 req-fe21e270-5e8d-4008-887e-8601acc93f1c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] No waiting events found dispatching network-vif-plugged-06aeb511-67a6-4547-b061-9c4760285e3b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:22:59 np0005593295 nova_compute[225701]: 2026-01-23 10:22:59.482 225706 WARNING nova.compute.manager [req-fb65a6f4-dc43-4684-9c29-9f866fbc0387 req-fe21e270-5e8d-4008-887e-8601acc93f1c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Received unexpected event network-vif-plugged-06aeb511-67a6-4547-b061-9c4760285e3b for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:22:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:22:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:00 np0005593295 nova_compute[225701]: 2026-01-23 10:23:00.481 225706 DEBUG nova.network.neutron [-] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:23:00 np0005593295 nova_compute[225701]: 2026-01-23 10:23:00.502 225706 INFO nova.compute.manager [-] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Took 3.10 seconds to deallocate network for instance.#033[00m
Jan 23 05:23:00 np0005593295 nova_compute[225701]: 2026-01-23 10:23:00.541 225706 DEBUG oslo_concurrency.lockutils [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:00 np0005593295 nova_compute[225701]: 2026-01-23 10:23:00.542 225706 DEBUG oslo_concurrency.lockutils [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:00 np0005593295 nova_compute[225701]: 2026-01-23 10:23:00.549 225706 DEBUG nova.compute.manager [req-21b59a28-e133-4c8b-8f41-06fd329525d6 req-daa71949-13c6-4d8c-85b2-100e939a5b20 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Received event network-vif-deleted-06aeb511-67a6-4547-b061-9c4760285e3b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:23:00 np0005593295 nova_compute[225701]: 2026-01-23 10:23:00.572 225706 DEBUG nova.scheduler.client.report [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Refreshing inventories for resource provider db762d15-510c-4120-bfc4-afe76b90b657 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:23:00 np0005593295 nova_compute[225701]: 2026-01-23 10:23:00.586 225706 DEBUG nova.scheduler.client.report [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Updating ProviderTree inventory for provider db762d15-510c-4120-bfc4-afe76b90b657 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:23:00 np0005593295 nova_compute[225701]: 2026-01-23 10:23:00.586 225706 DEBUG nova.compute.provider_tree [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Updating inventory in ProviderTree for provider db762d15-510c-4120-bfc4-afe76b90b657 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:23:00 np0005593295 nova_compute[225701]: 2026-01-23 10:23:00.597 225706 DEBUG nova.scheduler.client.report [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Refreshing aggregate associations for resource provider db762d15-510c-4120-bfc4-afe76b90b657, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:23:00 np0005593295 nova_compute[225701]: 2026-01-23 10:23:00.621 225706 DEBUG nova.scheduler.client.report [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Refreshing trait associations for resource provider db762d15-510c-4120-bfc4-afe76b90b657, traits: COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX2,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_F16C,HW_CPU_X86_ABM,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_BMI2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SVM,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_FDC,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_CLMUL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:23:00 np0005593295 nova_compute[225701]: 2026-01-23 10:23:00.651 225706 DEBUG oslo_concurrency.processutils [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:00.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:00 np0005593295 nova_compute[225701]: 2026-01-23 10:23:00.778 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:01 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:23:01 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2480691247' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:23:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:23:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:01.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:23:01 np0005593295 nova_compute[225701]: 2026-01-23 10:23:01.151 225706 DEBUG oslo_concurrency.processutils [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:01 np0005593295 nova_compute[225701]: 2026-01-23 10:23:01.156 225706 DEBUG nova.compute.provider_tree [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:23:01 np0005593295 nova_compute[225701]: 2026-01-23 10:23:01.189 225706 DEBUG nova.scheduler.client.report [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:23:01 np0005593295 nova_compute[225701]: 2026-01-23 10:23:01.221 225706 DEBUG oslo_concurrency.lockutils [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.679s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:01 np0005593295 nova_compute[225701]: 2026-01-23 10:23:01.285 225706 INFO nova.scheduler.client.report [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Deleted allocations for instance b8ea49c6-5f62-47b0-92cc-7399bfc98528#033[00m
Jan 23 05:23:01 np0005593295 nova_compute[225701]: 2026-01-23 10:23:01.360 225706 DEBUG oslo_concurrency.lockutils [None req-145fb96e-2591-43f6-ad73-c980f5294a22 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "b8ea49c6-5f62-47b0-92cc-7399bfc98528" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.959s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:01 np0005593295 nova_compute[225701]: 2026-01-23 10:23:01.685 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:02.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:02 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:02 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:23:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:02 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:23:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:03.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:03 np0005593295 nova_compute[225701]: 2026-01-23 10:23:03.661 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:03 np0005593295 nova_compute[225701]: 2026-01-23 10:23:03.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:03 np0005593295 nova_compute[225701]: 2026-01-23 10:23:03.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:23:03 np0005593295 nova_compute[225701]: 2026-01-23 10:23:03.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:23:03 np0005593295 nova_compute[225701]: 2026-01-23 10:23:03.809 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:23:03 np0005593295 nova_compute[225701]: 2026-01-23 10:23:03.810 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:23:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:04.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:23:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:05.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:06 np0005593295 nova_compute[225701]: 2026-01-23 10:23:06.688 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:06.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:06 np0005593295 nova_compute[225701]: 2026-01-23 10:23:06.805 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:23:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:07.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:23:07 np0005593295 nova_compute[225701]: 2026-01-23 10:23:07.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:07 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:08 np0005593295 nova_compute[225701]: 2026-01-23 10:23:08.261 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:08 np0005593295 nova_compute[225701]: 2026-01-23 10:23:08.261 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:08 np0005593295 nova_compute[225701]: 2026-01-23 10:23:08.262 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:08 np0005593295 nova_compute[225701]: 2026-01-23 10:23:08.262 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:23:08 np0005593295 nova_compute[225701]: 2026-01-23 10:23:08.263 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:08 np0005593295 nova_compute[225701]: 2026-01-23 10:23:08.664 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:08 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:23:08 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/644210232' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:23:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:08.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:08 np0005593295 nova_compute[225701]: 2026-01-23 10:23:08.752 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:08 np0005593295 nova_compute[225701]: 2026-01-23 10:23:08.907 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:23:08 np0005593295 nova_compute[225701]: 2026-01-23 10:23:08.908 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4856MB free_disk=59.942562103271484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:23:08 np0005593295 nova_compute[225701]: 2026-01-23 10:23:08.908 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:08 np0005593295 nova_compute[225701]: 2026-01-23 10:23:08.909 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:23:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 05:23:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 05:23:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 05:23:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 05:23:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 05:23:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 05:23:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:23:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:23:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:23:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 05:23:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 05:23:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 05:23:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 05:23:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 05:23:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 05:23:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 05:23:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 05:23:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 05:23:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 05:23:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 05:23:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 05:23:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 05:23:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 05:23:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:23:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 05:23:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 05:23:09 np0005593295 nova_compute[225701]: 2026-01-23 10:23:09.063 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:23:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:09.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:23:09 np0005593295 nova_compute[225701]: 2026-01-23 10:23:09.300 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:23:09 np0005593295 nova_compute[225701]: 2026-01-23 10:23:09.300 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:23:09 np0005593295 nova_compute[225701]: 2026-01-23 10:23:09.318 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbef0000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:09 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:23:09 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1432565687' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:23:09 np0005593295 nova_compute[225701]: 2026-01-23 10:23:09.747 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:09 np0005593295 nova_compute[225701]: 2026-01-23 10:23:09.752 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:23:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:10 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc0016c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:10 np0005593295 nova_compute[225701]: 2026-01-23 10:23:10.632 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:23:10 np0005593295 nova_compute[225701]: 2026-01-23 10:23:10.659 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:23:10 np0005593295 nova_compute[225701]: 2026-01-23 10:23:10.659 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.751s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:10 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:23:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:10.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:23:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:11.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:11 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:11 np0005593295 nova_compute[225701]: 2026-01-23 10:23:11.655 225706 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163776.6524646, b8ea49c6-5f62-47b0-92cc-7399bfc98528 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:23:11 np0005593295 nova_compute[225701]: 2026-01-23 10:23:11.655 225706 INFO nova.compute.manager [-] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:23:11 np0005593295 nova_compute[225701]: 2026-01-23 10:23:11.689 225706 DEBUG nova.compute.manager [None req-dd69c3ed-56ee-42a9-8f93-f3a4b28879e1 - - - - - -] [instance: b8ea49c6-5f62-47b0-92cc-7399bfc98528] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:23:11 np0005593295 nova_compute[225701]: 2026-01-23 10:23:11.691 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:12 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/102312 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:23:12 np0005593295 nova_compute[225701]: 2026-01-23 10:23:12.660 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:12 np0005593295 nova_compute[225701]: 2026-01-23 10:23:12.661 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:12 np0005593295 nova_compute[225701]: 2026-01-23 10:23:12.661 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:12 np0005593295 nova_compute[225701]: 2026-01-23 10:23:12.662 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:12 np0005593295 nova_compute[225701]: 2026-01-23 10:23:12.662 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:12 np0005593295 nova_compute[225701]: 2026-01-23 10:23:12.662 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:23:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:12 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:12.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:12 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:13.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:13 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:13 np0005593295 nova_compute[225701]: 2026-01-23 10:23:13.666 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:14 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:14 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:23:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:14.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:23:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:15.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:15 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:16 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:16 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:16 np0005593295 nova_compute[225701]: 2026-01-23 10:23:16.694 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:23:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:16.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:23:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:17.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:17 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:17 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:18 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:18 np0005593295 nova_compute[225701]: 2026-01-23 10:23:18.668 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:18 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:18.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:23:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:19.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:23:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:19 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:20 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:20 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:20.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:21.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:21 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:21 np0005593295 nova_compute[225701]: 2026-01-23 10:23:21.697 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:22 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:22 np0005593295 podman[233177]: 2026-01-23 10:23:22.341592204 +0000 UTC m=+0.057224716 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 23 05:23:22 np0005593295 podman[233176]: 2026-01-23 10:23:22.408822695 +0000 UTC m=+0.128955607 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:23:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:22 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:23:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:22.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:23:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:22 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:23.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:23 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:23 np0005593295 nova_compute[225701]: 2026-01-23 10:23:23.670 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:24 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:24 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:23:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:24.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:23:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:23:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:25.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:23:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:25 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:26 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:26 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:26 np0005593295 nova_compute[225701]: 2026-01-23 10:23:26.701 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:23:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:26.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:23:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:23:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:27.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:23:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:27 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:27 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:28 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:28 np0005593295 nova_compute[225701]: 2026-01-23 10:23:28.672 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:28 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:23:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:28.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:23:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:23:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:29.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:23:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:29 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:30 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:30 np0005593295 nova_compute[225701]: 2026-01-23 10:23:30.518 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:30 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:23:30.518 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:23:30 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:23:30.520 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:23:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:30 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:23:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:30.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:23:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:31.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:31 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:31 np0005593295 nova_compute[225701]: 2026-01-23 10:23:31.703 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:32 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:32 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:23:32.524 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:32 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:32.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:32 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:23:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:33.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:23:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:33 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:33 np0005593295 nova_compute[225701]: 2026-01-23 10:23:33.674 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:34 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:34 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/102334 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:23:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:23:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:34.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:23:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:35.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:35 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:36 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:36 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:36 np0005593295 nova_compute[225701]: 2026-01-23 10:23:36.705 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:23:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:36.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:23:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:23:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:37.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:23:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:37 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:38 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:38 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:38 np0005593295 nova_compute[225701]: 2026-01-23 10:23:38.674 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:38 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:38.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:39.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:39 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbecc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:40 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:40 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:40.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:41.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:41 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:41 np0005593295 nova_compute[225701]: 2026-01-23 10:23:41.760 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.870980) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163821871203, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 1624, "num_deletes": 505, "total_data_size": 3294691, "memory_usage": 3352160, "flush_reason": "Manual Compaction"}
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163821883003, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 1383209, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28353, "largest_seqno": 29972, "table_properties": {"data_size": 1377972, "index_size": 2057, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 16001, "raw_average_key_size": 19, "raw_value_size": 1364717, "raw_average_value_size": 1630, "num_data_blocks": 90, "num_entries": 837, "num_filter_entries": 837, "num_deletions": 505, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163707, "oldest_key_time": 1769163707, "file_creation_time": 1769163821, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 12055 microseconds, and 5686 cpu microseconds.
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.883096) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 1383209 bytes OK
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.883133) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.884942) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.884963) EVENT_LOG_v1 {"time_micros": 1769163821884959, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.884981) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 3286298, prev total WAL file size 3286298, number of live WAL files 2.
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.886084) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353130' seq:72057594037927935, type:22 .. '6C6F676D00373631' seq:0, type:0; will stop at (end)
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(1350KB)], [54(14MB)]
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163821886218, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 16235185, "oldest_snapshot_seqno": -1}
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 5783 keys, 12761257 bytes, temperature: kUnknown
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163821966194, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 12761257, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12723894, "index_size": 21829, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14469, "raw_key_size": 149269, "raw_average_key_size": 25, "raw_value_size": 12620398, "raw_average_value_size": 2182, "num_data_blocks": 877, "num_entries": 5783, "num_filter_entries": 5783, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769163821, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.966564) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 12761257 bytes
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.968263) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 202.7 rd, 159.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 14.2 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(21.0) write-amplify(9.2) OK, records in: 6761, records dropped: 978 output_compression: NoCompression
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.968288) EVENT_LOG_v1 {"time_micros": 1769163821968276, "job": 32, "event": "compaction_finished", "compaction_time_micros": 80083, "compaction_time_cpu_micros": 30584, "output_level": 6, "num_output_files": 1, "total_output_size": 12761257, "num_input_records": 6761, "num_output_records": 5783, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163821968792, "job": 32, "event": "table_file_deletion", "file_number": 56}
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163821972346, "job": 32, "event": "table_file_deletion", "file_number": 54}
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.885886) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.972409) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.972414) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.972416) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.972418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:41 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:23:41.972420) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:42 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec8000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:42 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 05:23:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:42 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:42.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:43 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:43.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:43 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:43 np0005593295 nova_compute[225701]: 2026-01-23 10:23:43.676 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:44 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:44 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:23:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:44.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:23:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:45.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:45 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:45 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 05:23:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:45 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 05:23:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:46 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:46 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:46 np0005593295 nova_compute[225701]: 2026-01-23 10:23:46.763 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:23:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:46.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:23:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:47.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:47 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:48 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:48 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:48 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 05:23:48 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 23 05:23:48 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1571219015' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:23:48 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 23 05:23:48 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1571219015' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:23:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:48 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:48 np0005593295 nova_compute[225701]: 2026-01-23 10:23:48.738 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:23:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:48.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:23:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:49.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:49 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:50 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:50 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:50.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:51 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:23:51 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:23:51 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:23:51 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:23:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:51.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:51 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:51 np0005593295 nova_compute[225701]: 2026-01-23 10:23:51.766 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:52 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:52 np0005593295 podman[233389]: 2026-01-23 10:23:52.662595896 +0000 UTC m=+0.072740857 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 05:23:52 np0005593295 podman[233388]: 2026-01-23 10:23:52.698709302 +0000 UTC m=+0.107646074 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 23 05:23:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:52 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:52.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:53 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:23:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:53.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:23:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:53 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:53 np0005593295 nova_compute[225701]: 2026-01-23 10:23:53.742 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:54 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:54 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/102354 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 05:23:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:23:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:54.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:23:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:55.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:23:55.491 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:23:55.491 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:23:55.492 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:55 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:55 np0005593295 ovn_controller[132789]: 2026-01-23T10:23:55Z|00045|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 23 05:23:55 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:23:55 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:23:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:56 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:56 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:56 np0005593295 nova_compute[225701]: 2026-01-23 10:23:56.770 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:56.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:57.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:57 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:58 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:58 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:58 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:58 np0005593295 nova_compute[225701]: 2026-01-23 10:23:58.744 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:58.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:23:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:23:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:23:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:59.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:23:59 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:23:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:23:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:00 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:00 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:00 np0005593295 nova_compute[225701]: 2026-01-23 10:24:00.780 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.003000072s ======
Jan 23 05:24:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:00.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000072s
Jan 23 05:24:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:24:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:01.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:24:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:01 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:01 np0005593295 nova_compute[225701]: 2026-01-23 10:24:01.773 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:02 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:02 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:02.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:03 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:03.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:03 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:03 np0005593295 nova_compute[225701]: 2026-01-23 10:24:03.747 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:03 np0005593295 nova_compute[225701]: 2026-01-23 10:24:03.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:03 np0005593295 nova_compute[225701]: 2026-01-23 10:24:03.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:24:03 np0005593295 nova_compute[225701]: 2026-01-23 10:24:03.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:24:03 np0005593295 nova_compute[225701]: 2026-01-23 10:24:03.802 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:24:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:04 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:04 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:04.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:05.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:05 np0005593295 nova_compute[225701]: 2026-01-23 10:24:05.349 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "bae2b00f-87e8-40b7-b7ba-972f7c531998" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:05 np0005593295 nova_compute[225701]: 2026-01-23 10:24:05.349 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:05 np0005593295 nova_compute[225701]: 2026-01-23 10:24:05.377 225706 DEBUG nova.compute.manager [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:24:05 np0005593295 nova_compute[225701]: 2026-01-23 10:24:05.485 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:05 np0005593295 nova_compute[225701]: 2026-01-23 10:24:05.486 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:05 np0005593295 nova_compute[225701]: 2026-01-23 10:24:05.494 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:24:05 np0005593295 nova_compute[225701]: 2026-01-23 10:24:05.494 225706 INFO nova.compute.claims [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:24:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:05 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:05 np0005593295 nova_compute[225701]: 2026-01-23 10:24:05.630 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:05 np0005593295 nova_compute[225701]: 2026-01-23 10:24:05.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:06 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:24:06 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3596100017' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:24:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.088 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.096 225706 DEBUG nova.compute.provider_tree [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.119 225706 DEBUG nova.scheduler.client.report [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.138 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.139 225706 DEBUG nova.compute.manager [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.256 225706 DEBUG nova.compute.manager [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.257 225706 DEBUG nova.network.neutron [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:24:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:06 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.316 225706 INFO nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.339 225706 DEBUG nova.compute.manager [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.434 225706 DEBUG nova.compute.manager [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.435 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.435 225706 INFO nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Creating image(s)#033[00m
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.459 225706 DEBUG nova.storage.rbd_utils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image bae2b00f-87e8-40b7-b7ba-972f7c531998_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.481 225706 DEBUG nova.storage.rbd_utils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image bae2b00f-87e8-40b7-b7ba-972f7c531998_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.503 225706 DEBUG nova.storage.rbd_utils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image bae2b00f-87e8-40b7-b7ba-972f7c531998_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.507 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.561 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.562 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "379b2821245bc82aa5a95839eddb9a97716b559c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.562 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.562 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.582 225706 DEBUG nova.storage.rbd_utils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image bae2b00f-87e8-40b7-b7ba-972f7c531998_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.586 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c bae2b00f-87e8-40b7-b7ba-972f7c531998_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:06 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.769 225706 DEBUG nova.policy [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f459c4e71e6c47acb0f8aaf83f34695e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:24:06 np0005593295 nova_compute[225701]: 2026-01-23 10:24:06.775 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:06.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:07.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:07 np0005593295 nova_compute[225701]: 2026-01-23 10:24:07.386 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c bae2b00f-87e8-40b7-b7ba-972f7c531998_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.800s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:07 np0005593295 nova_compute[225701]: 2026-01-23 10:24:07.465 225706 DEBUG nova.storage.rbd_utils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] resizing rbd image bae2b00f-87e8-40b7-b7ba-972f7c531998_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:24:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:07 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec4003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:07 np0005593295 nova_compute[225701]: 2026-01-23 10:24:07.570 225706 DEBUG nova.objects.instance [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'migration_context' on Instance uuid bae2b00f-87e8-40b7-b7ba-972f7c531998 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:24:07 np0005593295 nova_compute[225701]: 2026-01-23 10:24:07.590 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:24:07 np0005593295 nova_compute[225701]: 2026-01-23 10:24:07.591 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Ensure instance console log exists: /var/lib/nova/instances/bae2b00f-87e8-40b7-b7ba-972f7c531998/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:24:07 np0005593295 nova_compute[225701]: 2026-01-23 10:24:07.592 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:07 np0005593295 nova_compute[225701]: 2026-01-23 10:24:07.592 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:07 np0005593295 nova_compute[225701]: 2026-01-23 10:24:07.593 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:08 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbed0004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:08 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbec8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:08 np0005593295 nova_compute[225701]: 2026-01-23 10:24:08.749 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:24:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:08.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:24:09 np0005593295 nova_compute[225701]: 2026-01-23 10:24:09.055 225706 DEBUG nova.network.neutron [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Successfully updated port: d744a552-c706-444a-8a15-4a98c41eed50 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:24:09 np0005593295 nova_compute[225701]: 2026-01-23 10:24:09.075 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-bae2b00f-87e8-40b7-b7ba-972f7c531998" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:24:09 np0005593295 nova_compute[225701]: 2026-01-23 10:24:09.075 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-bae2b00f-87e8-40b7-b7ba-972f7c531998" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:24:09 np0005593295 nova_compute[225701]: 2026-01-23 10:24:09.075 225706 DEBUG nova.network.neutron [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:24:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:09 np0005593295 nova_compute[225701]: 2026-01-23 10:24:09.172 225706 DEBUG nova.compute.manager [req-18cca381-cee9-4466-8a0e-9fc59ae8e7c8 req-304d948e-3dea-4dbb-bd63-4985b25c0a55 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Received event network-changed-d744a552-c706-444a-8a15-4a98c41eed50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:09 np0005593295 nova_compute[225701]: 2026-01-23 10:24:09.172 225706 DEBUG nova.compute.manager [req-18cca381-cee9-4466-8a0e-9fc59ae8e7c8 req-304d948e-3dea-4dbb-bd63-4985b25c0a55 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Refreshing instance network info cache due to event network-changed-d744a552-c706-444a-8a15-4a98c41eed50. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:24:09 np0005593295 nova_compute[225701]: 2026-01-23 10:24:09.172 225706 DEBUG oslo_concurrency.lockutils [req-18cca381-cee9-4466-8a0e-9fc59ae8e7c8 req-304d948e-3dea-4dbb-bd63-4985b25c0a55 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-bae2b00f-87e8-40b7-b7ba-972f7c531998" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:24:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:09.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:09 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 05:24:09 np0005593295 nova_compute[225701]: 2026-01-23 10:24:09.576 225706 DEBUG nova.network.neutron [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:24:09 np0005593295 nova_compute[225701]: 2026-01-23 10:24:09.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:09 np0005593295 nova_compute[225701]: 2026-01-23 10:24:09.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:09 np0005593295 nova_compute[225701]: 2026-01-23 10:24:09.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:09 np0005593295 nova_compute[225701]: 2026-01-23 10:24:09.807 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:09 np0005593295 nova_compute[225701]: 2026-01-23 10:24:09.808 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:09 np0005593295 nova_compute[225701]: 2026-01-23 10:24:09.808 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:09 np0005593295 nova_compute[225701]: 2026-01-23 10:24:09.808 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:24:09 np0005593295 nova_compute[225701]: 2026-01-23 10:24:09.808 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:10 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:24:10 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2577585059' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.243 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi[232998]: 23/01/2026 10:24:10 : epoch 69734c00 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbedc001fe0 fd 39 proxy ignored for local
Jan 23 05:24:10 np0005593295 kernel: ganesha.nfsd[233132]: segfault at 50 ip 00007fbf73aa632e sp 00007fbef8ff8210 error 4 in libntirpc.so.5.8[7fbf73a8b000+2c000] likely on CPU 7 (core 0, socket 7)
Jan 23 05:24:10 np0005593295 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 05:24:10 np0005593295 systemd[1]: Started Process Core Dump (PID 233709/UID 0).
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.393 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.394 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4884MB free_disk=59.97146224975586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.394 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.394 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.462 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Instance bae2b00f-87e8-40b7-b7ba-972f7c531998 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.463 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.463 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.513 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.534 225706 DEBUG nova.network.neutron [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Updating instance_info_cache with network_info: [{"id": "d744a552-c706-444a-8a15-4a98c41eed50", "address": "fa:16:3e:9f:48:6d", "network": {"id": "2fb57e44-e877-47c8-860b-b36d5b5ff599", "bridge": "br-int", "label": "tempest-network-smoke--2143346610", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd744a552-c7", "ovs_interfaceid": "d744a552-c706-444a-8a15-4a98c41eed50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.558 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-bae2b00f-87e8-40b7-b7ba-972f7c531998" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.559 225706 DEBUG nova.compute.manager [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Instance network_info: |[{"id": "d744a552-c706-444a-8a15-4a98c41eed50", "address": "fa:16:3e:9f:48:6d", "network": {"id": "2fb57e44-e877-47c8-860b-b36d5b5ff599", "bridge": "br-int", "label": "tempest-network-smoke--2143346610", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd744a552-c7", "ovs_interfaceid": "d744a552-c706-444a-8a15-4a98c41eed50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.559 225706 DEBUG oslo_concurrency.lockutils [req-18cca381-cee9-4466-8a0e-9fc59ae8e7c8 req-304d948e-3dea-4dbb-bd63-4985b25c0a55 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-bae2b00f-87e8-40b7-b7ba-972f7c531998" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.559 225706 DEBUG nova.network.neutron [req-18cca381-cee9-4466-8a0e-9fc59ae8e7c8 req-304d948e-3dea-4dbb-bd63-4985b25c0a55 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Refreshing network info cache for port d744a552-c706-444a-8a15-4a98c41eed50 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.563 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Start _get_guest_xml network_info=[{"id": "d744a552-c706-444a-8a15-4a98c41eed50", "address": "fa:16:3e:9f:48:6d", "network": {"id": "2fb57e44-e877-47c8-860b-b36d5b5ff599", "bridge": "br-int", "label": "tempest-network-smoke--2143346610", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd744a552-c7", "ovs_interfaceid": "d744a552-c706-444a-8a15-4a98c41eed50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_options': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '271ec98e-d058-421b-bbfb-4b4a5954c90a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.568 225706 WARNING nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.573 225706 DEBUG nova.virt.libvirt.host [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.574 225706 DEBUG nova.virt.libvirt.host [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.577 225706 DEBUG nova.virt.libvirt.host [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.577 225706 DEBUG nova.virt.libvirt.host [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.578 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.578 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T10:15:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1d8c8bf4-786e-4009-bc53-f259480fb5b3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.579 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.579 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.579 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.580 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.580 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.580 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.581 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.581 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.581 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.581 225706 DEBUG nova.virt.hardware [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.586 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:10.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:10 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:24:10 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/287250540' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.976 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:10 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.981 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:10.999 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:24:11 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 05:24:11 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/308211205' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.029 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.030 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.635s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.030 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.061 225706 DEBUG nova.storage.rbd_utils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image bae2b00f-87e8-40b7-b7ba-972f7c531998_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.065 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:24:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:11.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:24:11 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 05:24:11 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/469737160' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:24:11 np0005593295 systemd-coredump[233710]: Process 233002 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 45:#012#0  0x00007fbf73aa632e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.521 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.523 225706 DEBUG nova.virt.libvirt.vif [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:24:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1790835147',display_name='tempest-TestNetworkBasicOps-server-1790835147',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1790835147',id=9,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLwh/ci1qy20vB5FyaBepDv6KpYIxs8h6oo7gGlHu7RZtK7kr5mjuHzqdrX+yDa6v1DJrzMXWjaBuQGyTdeFGY8MLFkkRTd0XB8VJHoKHx7kcuI7EyiJu2dhMv2/NI1ZTg==',key_name='tempest-TestNetworkBasicOps-520442326',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-ox8kizyd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:24:06Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=bae2b00f-87e8-40b7-b7ba-972f7c531998,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d744a552-c706-444a-8a15-4a98c41eed50", "address": "fa:16:3e:9f:48:6d", "network": {"id": "2fb57e44-e877-47c8-860b-b36d5b5ff599", "bridge": "br-int", "label": "tempest-network-smoke--2143346610", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd744a552-c7", "ovs_interfaceid": "d744a552-c706-444a-8a15-4a98c41eed50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.523 225706 DEBUG nova.network.os_vif_util [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "d744a552-c706-444a-8a15-4a98c41eed50", "address": "fa:16:3e:9f:48:6d", "network": {"id": "2fb57e44-e877-47c8-860b-b36d5b5ff599", "bridge": "br-int", "label": "tempest-network-smoke--2143346610", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd744a552-c7", "ovs_interfaceid": "d744a552-c706-444a-8a15-4a98c41eed50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.524 225706 DEBUG nova.network.os_vif_util [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:48:6d,bridge_name='br-int',has_traffic_filtering=True,id=d744a552-c706-444a-8a15-4a98c41eed50,network=Network(2fb57e44-e877-47c8-860b-b36d5b5ff599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd744a552-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.526 225706 DEBUG nova.objects.instance [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'pci_devices' on Instance uuid bae2b00f-87e8-40b7-b7ba-972f7c531998 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:24:11 np0005593295 systemd[1]: systemd-coredump@13-233709-0.service: Deactivated successfully.
Jan 23 05:24:11 np0005593295 systemd[1]: systemd-coredump@13-233709-0.service: Consumed 1.197s CPU time.
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.639 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:24:11 np0005593295 nova_compute[225701]:  <uuid>bae2b00f-87e8-40b7-b7ba-972f7c531998</uuid>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:  <name>instance-00000009</name>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:  <memory>131072</memory>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:  <vcpu>1</vcpu>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:  <metadata>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <nova:name>tempest-TestNetworkBasicOps-server-1790835147</nova:name>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <nova:creationTime>2026-01-23 10:24:10</nova:creationTime>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <nova:flavor name="m1.nano">
Jan 23 05:24:11 np0005593295 nova_compute[225701]:        <nova:memory>128</nova:memory>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:        <nova:disk>1</nova:disk>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:        <nova:swap>0</nova:swap>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      </nova:flavor>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <nova:owner>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:        <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:        <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      </nova:owner>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <nova:ports>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:        <nova:port uuid="d744a552-c706-444a-8a15-4a98c41eed50">
Jan 23 05:24:11 np0005593295 nova_compute[225701]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:        </nova:port>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      </nova:ports>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    </nova:instance>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:  </metadata>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:  <sysinfo type="smbios">
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <system>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <entry name="serial">bae2b00f-87e8-40b7-b7ba-972f7c531998</entry>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <entry name="uuid">bae2b00f-87e8-40b7-b7ba-972f7c531998</entry>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    </system>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:  </sysinfo>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:  <os>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <boot dev="hd"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <smbios mode="sysinfo"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:  </os>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:  <features>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <acpi/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <apic/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <vmcoreinfo/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:  </features>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:  <clock offset="utc">
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <timer name="hpet" present="no"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:  </clock>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:  <cpu mode="host-model" match="exact">
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:  </cpu>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:  <devices>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <disk type="network" device="disk">
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <driver type="raw" cache="none"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <source protocol="rbd" name="vms/bae2b00f-87e8-40b7-b7ba-972f7c531998_disk">
Jan 23 05:24:11 np0005593295 nova_compute[225701]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      </source>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <auth username="openstack">
Jan 23 05:24:11 np0005593295 nova_compute[225701]:        <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      </auth>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <target dev="vda" bus="virtio"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    </disk>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <disk type="network" device="cdrom">
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <driver type="raw" cache="none"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <source protocol="rbd" name="vms/bae2b00f-87e8-40b7-b7ba-972f7c531998_disk.config">
Jan 23 05:24:11 np0005593295 nova_compute[225701]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      </source>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <auth username="openstack">
Jan 23 05:24:11 np0005593295 nova_compute[225701]:        <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      </auth>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <target dev="sda" bus="sata"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    </disk>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <interface type="ethernet">
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <mac address="fa:16:3e:9f:48:6d"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <model type="virtio"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <mtu size="1442"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <target dev="tapd744a552-c7"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    </interface>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <serial type="pty">
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <log file="/var/lib/nova/instances/bae2b00f-87e8-40b7-b7ba-972f7c531998/console.log" append="off"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    </serial>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <video>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <model type="virtio"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    </video>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <input type="tablet" bus="usb"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <rng model="virtio">
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    </rng>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <controller type="usb" index="0"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    <memballoon model="virtio">
Jan 23 05:24:11 np0005593295 nova_compute[225701]:      <stats period="10"/>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:    </memballoon>
Jan 23 05:24:11 np0005593295 nova_compute[225701]:  </devices>
Jan 23 05:24:11 np0005593295 nova_compute[225701]: </domain>
Jan 23 05:24:11 np0005593295 nova_compute[225701]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.639 225706 DEBUG nova.compute.manager [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Preparing to wait for external event network-vif-plugged-d744a552-c706-444a-8a15-4a98c41eed50 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.640 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.640 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.640 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.641 225706 DEBUG nova.virt.libvirt.vif [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:24:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1790835147',display_name='tempest-TestNetworkBasicOps-server-1790835147',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1790835147',id=9,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLwh/ci1qy20vB5FyaBepDv6KpYIxs8h6oo7gGlHu7RZtK7kr5mjuHzqdrX+yDa6v1DJrzMXWjaBuQGyTdeFGY8MLFkkRTd0XB8VJHoKHx7kcuI7EyiJu2dhMv2/NI1ZTg==',key_name='tempest-TestNetworkBasicOps-520442326',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-ox8kizyd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:24:06Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=bae2b00f-87e8-40b7-b7ba-972f7c531998,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d744a552-c706-444a-8a15-4a98c41eed50", "address": "fa:16:3e:9f:48:6d", "network": {"id": "2fb57e44-e877-47c8-860b-b36d5b5ff599", "bridge": "br-int", "label": "tempest-network-smoke--2143346610", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd744a552-c7", "ovs_interfaceid": "d744a552-c706-444a-8a15-4a98c41eed50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.641 225706 DEBUG nova.network.os_vif_util [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "d744a552-c706-444a-8a15-4a98c41eed50", "address": "fa:16:3e:9f:48:6d", "network": {"id": "2fb57e44-e877-47c8-860b-b36d5b5ff599", "bridge": "br-int", "label": "tempest-network-smoke--2143346610", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd744a552-c7", "ovs_interfaceid": "d744a552-c706-444a-8a15-4a98c41eed50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.642 225706 DEBUG nova.network.os_vif_util [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:48:6d,bridge_name='br-int',has_traffic_filtering=True,id=d744a552-c706-444a-8a15-4a98c41eed50,network=Network(2fb57e44-e877-47c8-860b-b36d5b5ff599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd744a552-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.642 225706 DEBUG os_vif [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:48:6d,bridge_name='br-int',has_traffic_filtering=True,id=d744a552-c706-444a-8a15-4a98c41eed50,network=Network(2fb57e44-e877-47c8-860b-b36d5b5ff599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd744a552-c7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.643 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.643 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.644 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.648 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.649 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd744a552-c7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.650 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd744a552-c7, col_values=(('external_ids', {'iface-id': 'd744a552-c706-444a-8a15-4a98c41eed50', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9f:48:6d', 'vm-uuid': 'bae2b00f-87e8-40b7-b7ba-972f7c531998'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:11 np0005593295 NetworkManager[48964]: <info>  [1769163851.6541] manager: (tapd744a552-c7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.654 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.656 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.660 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:11 np0005593295 podman[233801]: 2026-01-23 10:24:11.662471141 +0000 UTC m=+0.031354361 container died 00f7620c8686931b35559136526f1ddfd77324ee269c01ffbfe8698ca081684a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.661 225706 INFO os_vif [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:48:6d,bridge_name='br-int',has_traffic_filtering=True,id=d744a552-c706-444a-8a15-4a98c41eed50,network=Network(2fb57e44-e877-47c8-860b-b36d5b5ff599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd744a552-c7')#033[00m
Jan 23 05:24:11 np0005593295 systemd[1]: var-lib-containers-storage-overlay-5c9bf5cbdb826cd487d0a518ff7649bfcd11428b56788cb34e05d1a88f76de1b-merged.mount: Deactivated successfully.
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.740 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.741 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.741 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:9f:48:6d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.741 225706 INFO nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Using config drive#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.767 225706 DEBUG nova.storage.rbd_utils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image bae2b00f-87e8-40b7-b7ba-972f7c531998_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:24:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.934 225706 DEBUG nova.network.neutron [req-18cca381-cee9-4466-8a0e-9fc59ae8e7c8 req-304d948e-3dea-4dbb-bd63-4985b25c0a55 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Updated VIF entry in instance network info cache for port d744a552-c706-444a-8a15-4a98c41eed50. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.935 225706 DEBUG nova.network.neutron [req-18cca381-cee9-4466-8a0e-9fc59ae8e7c8 req-304d948e-3dea-4dbb-bd63-4985b25c0a55 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Updating instance_info_cache with network_info: [{"id": "d744a552-c706-444a-8a15-4a98c41eed50", "address": "fa:16:3e:9f:48:6d", "network": {"id": "2fb57e44-e877-47c8-860b-b36d5b5ff599", "bridge": "br-int", "label": "tempest-network-smoke--2143346610", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd744a552-c7", "ovs_interfaceid": "d744a552-c706-444a-8a15-4a98c41eed50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:24:11 np0005593295 nova_compute[225701]: 2026-01-23 10:24:11.950 225706 DEBUG oslo_concurrency.lockutils [req-18cca381-cee9-4466-8a0e-9fc59ae8e7c8 req-304d948e-3dea-4dbb-bd63-4985b25c0a55 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-bae2b00f-87e8-40b7-b7ba-972f7c531998" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:24:12 np0005593295 podman[233801]: 2026-01-23 10:24:12.029512693 +0000 UTC m=+0.398395913 container remove 00f7620c8686931b35559136526f1ddfd77324ee269c01ffbfe8698ca081684a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-1-0-compute-2-tykohi, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.029 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.030 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.030 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.030 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:24:12 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Main process exited, code=exited, status=139/n/a
Jan 23 05:24:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.164 225706 INFO nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Creating config drive at /var/lib/nova/instances/bae2b00f-87e8-40b7-b7ba-972f7c531998/disk.config#033[00m
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.176 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bae2b00f-87e8-40b7-b7ba-972f7c531998/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz3mh5p61 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:12 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 05:24:12 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.599s CPU time.
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.306 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bae2b00f-87e8-40b7-b7ba-972f7c531998/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz3mh5p61" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.336 225706 DEBUG nova.storage.rbd_utils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image bae2b00f-87e8-40b7-b7ba-972f7c531998_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.340 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/bae2b00f-87e8-40b7-b7ba-972f7c531998/disk.config bae2b00f-87e8-40b7-b7ba-972f7c531998_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.492 225706 DEBUG oslo_concurrency.processutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/bae2b00f-87e8-40b7-b7ba-972f7c531998/disk.config bae2b00f-87e8-40b7-b7ba-972f7c531998_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.153s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.493 225706 INFO nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Deleting local config drive /var/lib/nova/instances/bae2b00f-87e8-40b7-b7ba-972f7c531998/disk.config because it was imported into RBD.#033[00m
Jan 23 05:24:12 np0005593295 kernel: tapd744a552-c7: entered promiscuous mode
Jan 23 05:24:12 np0005593295 NetworkManager[48964]: <info>  [1769163852.5628] manager: (tapd744a552-c7): new Tun device (/org/freedesktop/NetworkManager/Devices/36)
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.563 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:12 np0005593295 ovn_controller[132789]: 2026-01-23T10:24:12Z|00046|binding|INFO|Claiming lport d744a552-c706-444a-8a15-4a98c41eed50 for this chassis.
Jan 23 05:24:12 np0005593295 ovn_controller[132789]: 2026-01-23T10:24:12Z|00047|binding|INFO|d744a552-c706-444a-8a15-4a98c41eed50: Claiming fa:16:3e:9f:48:6d 10.100.0.11
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.566 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.568 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.573 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.576 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:12 np0005593295 NetworkManager[48964]: <info>  [1769163852.5773] manager: (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Jan 23 05:24:12 np0005593295 NetworkManager[48964]: <info>  [1769163852.5781] manager: (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.591 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:48:6d 10.100.0.11'], port_security=['fa:16:3e:9f:48:6d 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1107750174', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'bae2b00f-87e8-40b7-b7ba-972f7c531998', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2fb57e44-e877-47c8-860b-b36d5b5ff599', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1107750174', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '7', 'neutron:security_group_ids': '41f899d0-e5bc-43b7-808c-efb54f22dad4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.207'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=78b908b7-6c71-4e47-8053-0540c37dfe2c, chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], logical_port=d744a552-c706-444a-8a15-4a98c41eed50) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:24:12 np0005593295 systemd-machined[194368]: New machine qemu-3-instance-00000009.
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.593 142606 INFO neutron.agent.ovn.metadata.agent [-] Port d744a552-c706-444a-8a15-4a98c41eed50 in datapath 2fb57e44-e877-47c8-860b-b36d5b5ff599 bound to our chassis#033[00m
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.595 142606 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2fb57e44-e877-47c8-860b-b36d5b5ff599#033[00m
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.611 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[a2b9e347-614c-44d0-9540-c8cf52bd26fa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.612 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2fb57e44-e1 in ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.614 229823 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2fb57e44-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.614 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[db786f37-7ec3-4f96-9a45-3f010b6c99ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.615 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[18f364ba-9697-4de9-8175-c3a263836bb8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.632 142723 DEBUG oslo.privsep.daemon [-] privsep: reply[1cb09208-eb6a-4952-bf55-644d18d35648]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:12 np0005593295 systemd[1]: Started Virtual Machine qemu-3-instance-00000009.
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.655 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.659 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[85ada648-006e-4e1d-a64f-582fa61f6965]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.670 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:12 np0005593295 ovn_controller[132789]: 2026-01-23T10:24:12Z|00048|binding|INFO|Setting lport d744a552-c706-444a-8a15-4a98c41eed50 ovn-installed in OVS
Jan 23 05:24:12 np0005593295 ovn_controller[132789]: 2026-01-23T10:24:12Z|00049|binding|INFO|Setting lport d744a552-c706-444a-8a15-4a98c41eed50 up in Southbound
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.675 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:12 np0005593295 systemd-udevd[233921]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:24:12 np0005593295 NetworkManager[48964]: <info>  [1769163852.6907] device (tapd744a552-c7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:24:12 np0005593295 NetworkManager[48964]: <info>  [1769163852.6916] device (tapd744a552-c7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.703 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[01e8b13d-d55b-49ca-8cb6-5f2f9ad00381]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.708 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[3dc47972-a098-43f2-9886-efb7d877928c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:12 np0005593295 NetworkManager[48964]: <info>  [1769163852.7095] manager: (tap2fb57e44-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.737 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[69b093fb-5a3a-4129-9834-947851c8c9ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.741 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[e9663c79-b5ee-4670-b198-1c0db84fb438]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:12 np0005593295 NetworkManager[48964]: <info>  [1769163852.7659] device (tap2fb57e44-e0): carrier: link connected
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.773 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[86ea1a4c-7154-4ee6-bb06-fe9b4123c12a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.788 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[8c6d3ce8-dcb1-4aac-bf91-a4fd2f1cd7ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2fb57e44-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:4a:5f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500541, 'reachable_time': 34219, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233950, 'error': None, 'target': 'ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.801 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[e216782d-3020-435b-b450-2b4d7707babe]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe25:4a5f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 500541, 'tstamp': 500541}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233951, 'error': None, 'target': 'ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.815 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[9a1a9a51-85e9-4df8-9504-5ab4df753d13]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2fb57e44-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:4a:5f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500541, 'reachable_time': 34219, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233952, 'error': None, 'target': 'ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.844 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[fab67375-3e5f-4126-9b0e-7f1a65fce899]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:12.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.895 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[17a318de-2193-403e-baef-3b46cd957e8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.896 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2fb57e44-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.896 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.897 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2fb57e44-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.898 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:12 np0005593295 NetworkManager[48964]: <info>  [1769163852.8992] manager: (tap2fb57e44-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Jan 23 05:24:12 np0005593295 kernel: tap2fb57e44-e0: entered promiscuous mode
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.900 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.903 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2fb57e44-e0, col_values=(('external_ids', {'iface-id': '77b74dfc-4c39-4ac5-b1a3-1aa2c0b19a29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.904 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:12 np0005593295 ovn_controller[132789]: 2026-01-23T10:24:12Z|00050|binding|INFO|Releasing lport 77b74dfc-4c39-4ac5-b1a3-1aa2c0b19a29 from this chassis (sb_readonly=0)
Jan 23 05:24:12 np0005593295 nova_compute[225701]: 2026-01-23 10:24:12.925 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.926 142606 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2fb57e44-e877-47c8-860b-b36d5b5ff599.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2fb57e44-e877-47c8-860b-b36d5b5ff599.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.927 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[023e5ff4-2601-4184-bba4-50106786a81b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.928 142606 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: global
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]:    log         /dev/log local0 debug
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]:    log-tag     haproxy-metadata-proxy-2fb57e44-e877-47c8-860b-b36d5b5ff599
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]:    user        root
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]:    group       root
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]:    maxconn     1024
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]:    pidfile     /var/lib/neutron/external/pids/2fb57e44-e877-47c8-860b-b36d5b5ff599.pid.haproxy
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]:    daemon
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: defaults
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]:    log global
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]:    mode http
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]:    option httplog
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]:    option dontlognull
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]:    option http-server-close
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]:    option forwardfor
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]:    retries                 3
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]:    timeout http-request    30s
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]:    timeout connect         30s
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]:    timeout client          32s
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]:    timeout server          32s
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]:    timeout http-keep-alive 30s
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: listen listener
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]:    bind 169.254.169.254:80
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]:    http-request add-header X-OVN-Network-ID 2fb57e44-e877-47c8-860b-b36d5b5ff599
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:24:12 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:12.929 142606 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599', 'env', 'PROCESS_TAG=haproxy-2fb57e44-e877-47c8-860b-b36d5b5ff599', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2fb57e44-e877-47c8-860b-b36d5b5ff599.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:24:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:13 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:13.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:13 np0005593295 podman[233992]: 2026-01-23 10:24:13.333115177 +0000 UTC m=+0.057164604 container create c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 05:24:13 np0005593295 systemd[1]: Started libpod-conmon-c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220.scope.
Jan 23 05:24:13 np0005593295 systemd[1]: Started libcrun container.
Jan 23 05:24:13 np0005593295 podman[233992]: 2026-01-23 10:24:13.300469625 +0000 UTC m=+0.024519072 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 05:24:13 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/71ae759b058fe5e6fdd63c6c93ecfa743881dbbef1d016ad8bb3ef3af3839996/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:24:13 np0005593295 podman[233992]: 2026-01-23 10:24:13.418499484 +0000 UTC m=+0.142548931 container init c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:24:13 np0005593295 podman[233992]: 2026-01-23 10:24:13.425209097 +0000 UTC m=+0.149258514 container start c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 05:24:13 np0005593295 neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599[234036]: [NOTICE]   (234045) : New worker (234048) forked
Jan 23 05:24:13 np0005593295 neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599[234036]: [NOTICE]   (234045) : Loading success.
Jan 23 05:24:13 np0005593295 nova_compute[225701]: 2026-01-23 10:24:13.477 225706 DEBUG nova.virt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Emitting event <LifecycleEvent: 1769163853.4773085, bae2b00f-87e8-40b7-b7ba-972f7c531998 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:24:13 np0005593295 nova_compute[225701]: 2026-01-23 10:24:13.478 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] VM Started (Lifecycle Event)#033[00m
Jan 23 05:24:13 np0005593295 nova_compute[225701]: 2026-01-23 10:24:13.645 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:24:13 np0005593295 nova_compute[225701]: 2026-01-23 10:24:13.649 225706 DEBUG nova.virt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Emitting event <LifecycleEvent: 1769163853.4806626, bae2b00f-87e8-40b7-b7ba-972f7c531998 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:24:13 np0005593295 nova_compute[225701]: 2026-01-23 10:24:13.650 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:24:13 np0005593295 nova_compute[225701]: 2026-01-23 10:24:13.683 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:24:13 np0005593295 nova_compute[225701]: 2026-01-23 10:24:13.686 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:24:13 np0005593295 nova_compute[225701]: 2026-01-23 10:24:13.706 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:24:13 np0005593295 nova_compute[225701]: 2026-01-23 10:24:13.750 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:14 np0005593295 nova_compute[225701]: 2026-01-23 10:24:14.844 225706 DEBUG nova.compute.manager [req-56f7c1ca-5909-4eda-b941-4cff376bcf52 req-bf6de78b-18ae-48f8-8864-1658c328d7c4 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Received event network-vif-plugged-d744a552-c706-444a-8a15-4a98c41eed50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:14 np0005593295 nova_compute[225701]: 2026-01-23 10:24:14.844 225706 DEBUG oslo_concurrency.lockutils [req-56f7c1ca-5909-4eda-b941-4cff376bcf52 req-bf6de78b-18ae-48f8-8864-1658c328d7c4 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:14 np0005593295 nova_compute[225701]: 2026-01-23 10:24:14.845 225706 DEBUG oslo_concurrency.lockutils [req-56f7c1ca-5909-4eda-b941-4cff376bcf52 req-bf6de78b-18ae-48f8-8864-1658c328d7c4 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:14 np0005593295 nova_compute[225701]: 2026-01-23 10:24:14.845 225706 DEBUG oslo_concurrency.lockutils [req-56f7c1ca-5909-4eda-b941-4cff376bcf52 req-bf6de78b-18ae-48f8-8864-1658c328d7c4 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:14 np0005593295 nova_compute[225701]: 2026-01-23 10:24:14.846 225706 DEBUG nova.compute.manager [req-56f7c1ca-5909-4eda-b941-4cff376bcf52 req-bf6de78b-18ae-48f8-8864-1658c328d7c4 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Processing event network-vif-plugged-d744a552-c706-444a-8a15-4a98c41eed50 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:24:14 np0005593295 nova_compute[225701]: 2026-01-23 10:24:14.847 225706 DEBUG nova.compute.manager [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:24:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:14.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:14 np0005593295 nova_compute[225701]: 2026-01-23 10:24:14.852 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:24:14 np0005593295 nova_compute[225701]: 2026-01-23 10:24:14.852 225706 DEBUG nova.virt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Emitting event <LifecycleEvent: 1769163854.8508878, bae2b00f-87e8-40b7-b7ba-972f7c531998 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:24:14 np0005593295 nova_compute[225701]: 2026-01-23 10:24:14.853 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:24:14 np0005593295 nova_compute[225701]: 2026-01-23 10:24:14.857 225706 INFO nova.virt.libvirt.driver [-] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Instance spawned successfully.#033[00m
Jan 23 05:24:14 np0005593295 nova_compute[225701]: 2026-01-23 10:24:14.857 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:24:14 np0005593295 nova_compute[225701]: 2026-01-23 10:24:14.882 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:24:14 np0005593295 nova_compute[225701]: 2026-01-23 10:24:14.887 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:24:14 np0005593295 nova_compute[225701]: 2026-01-23 10:24:14.887 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:24:14 np0005593295 nova_compute[225701]: 2026-01-23 10:24:14.888 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:24:14 np0005593295 nova_compute[225701]: 2026-01-23 10:24:14.888 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:24:14 np0005593295 nova_compute[225701]: 2026-01-23 10:24:14.888 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:24:14 np0005593295 nova_compute[225701]: 2026-01-23 10:24:14.889 225706 DEBUG nova.virt.libvirt.driver [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:24:14 np0005593295 nova_compute[225701]: 2026-01-23 10:24:14.893 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:24:15 np0005593295 nova_compute[225701]: 2026-01-23 10:24:15.013 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:24:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:15 np0005593295 nova_compute[225701]: 2026-01-23 10:24:15.104 225706 INFO nova.compute.manager [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Took 8.67 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:24:15 np0005593295 nova_compute[225701]: 2026-01-23 10:24:15.104 225706 DEBUG nova.compute.manager [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:24:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:24:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:15.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:24:15 np0005593295 nova_compute[225701]: 2026-01-23 10:24:15.412 225706 INFO nova.compute.manager [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Took 9.97 seconds to build instance.#033[00m
Jan 23 05:24:15 np0005593295 nova_compute[225701]: 2026-01-23 10:24:15.564 225706 DEBUG oslo_concurrency.lockutils [None req-ab0ba3fc-cc72-4922-bfc9-d9b0d7302c63 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/102416 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:24:16 np0005593295 nova_compute[225701]: 2026-01-23 10:24:16.654 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:24:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:16.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:24:17 np0005593295 nova_compute[225701]: 2026-01-23 10:24:17.046 225706 DEBUG nova.compute.manager [req-6d096059-5dc7-48de-854f-74ff390715b3 req-acd88a38-7752-42f5-8299-812eba4a9519 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Received event network-vif-plugged-d744a552-c706-444a-8a15-4a98c41eed50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:17 np0005593295 nova_compute[225701]: 2026-01-23 10:24:17.046 225706 DEBUG oslo_concurrency.lockutils [req-6d096059-5dc7-48de-854f-74ff390715b3 req-acd88a38-7752-42f5-8299-812eba4a9519 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:17 np0005593295 nova_compute[225701]: 2026-01-23 10:24:17.046 225706 DEBUG oslo_concurrency.lockutils [req-6d096059-5dc7-48de-854f-74ff390715b3 req-acd88a38-7752-42f5-8299-812eba4a9519 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:17 np0005593295 nova_compute[225701]: 2026-01-23 10:24:17.046 225706 DEBUG oslo_concurrency.lockutils [req-6d096059-5dc7-48de-854f-74ff390715b3 req-acd88a38-7752-42f5-8299-812eba4a9519 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:17 np0005593295 nova_compute[225701]: 2026-01-23 10:24:17.047 225706 DEBUG nova.compute.manager [req-6d096059-5dc7-48de-854f-74ff390715b3 req-acd88a38-7752-42f5-8299-812eba4a9519 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] No waiting events found dispatching network-vif-plugged-d744a552-c706-444a-8a15-4a98c41eed50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:24:17 np0005593295 nova_compute[225701]: 2026-01-23 10:24:17.047 225706 WARNING nova.compute.manager [req-6d096059-5dc7-48de-854f-74ff390715b3 req-acd88a38-7752-42f5-8299-812eba4a9519 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Received unexpected event network-vif-plugged-d744a552-c706-444a-8a15-4a98c41eed50 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:24:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:24:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:17.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:24:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:18 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:18 np0005593295 nova_compute[225701]: 2026-01-23 10:24:18.580 225706 DEBUG oslo_concurrency.lockutils [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "bae2b00f-87e8-40b7-b7ba-972f7c531998" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:18 np0005593295 nova_compute[225701]: 2026-01-23 10:24:18.580 225706 DEBUG oslo_concurrency.lockutils [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:18 np0005593295 nova_compute[225701]: 2026-01-23 10:24:18.581 225706 DEBUG oslo_concurrency.lockutils [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:18 np0005593295 nova_compute[225701]: 2026-01-23 10:24:18.581 225706 DEBUG oslo_concurrency.lockutils [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:18 np0005593295 nova_compute[225701]: 2026-01-23 10:24:18.581 225706 DEBUG oslo_concurrency.lockutils [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:18 np0005593295 nova_compute[225701]: 2026-01-23 10:24:18.582 225706 INFO nova.compute.manager [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Terminating instance#033[00m
Jan 23 05:24:18 np0005593295 nova_compute[225701]: 2026-01-23 10:24:18.583 225706 DEBUG nova.compute.manager [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:24:18 np0005593295 kernel: tapd744a552-c7 (unregistering): left promiscuous mode
Jan 23 05:24:18 np0005593295 NetworkManager[48964]: <info>  [1769163858.6362] device (tapd744a552-c7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:24:18 np0005593295 nova_compute[225701]: 2026-01-23 10:24:18.644 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:18 np0005593295 ovn_controller[132789]: 2026-01-23T10:24:18Z|00051|binding|INFO|Releasing lport d744a552-c706-444a-8a15-4a98c41eed50 from this chassis (sb_readonly=0)
Jan 23 05:24:18 np0005593295 ovn_controller[132789]: 2026-01-23T10:24:18Z|00052|binding|INFO|Setting lport d744a552-c706-444a-8a15-4a98c41eed50 down in Southbound
Jan 23 05:24:18 np0005593295 ovn_controller[132789]: 2026-01-23T10:24:18Z|00053|binding|INFO|Removing iface tapd744a552-c7 ovn-installed in OVS
Jan 23 05:24:18 np0005593295 nova_compute[225701]: 2026-01-23 10:24:18.648 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:18 np0005593295 nova_compute[225701]: 2026-01-23 10:24:18.662 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:18 np0005593295 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000009.scope: Deactivated successfully.
Jan 23 05:24:18 np0005593295 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000009.scope: Consumed 4.624s CPU time.
Jan 23 05:24:18 np0005593295 systemd-machined[194368]: Machine qemu-3-instance-00000009 terminated.
Jan 23 05:24:18 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:18.707 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:48:6d 10.100.0.11'], port_security=['fa:16:3e:9f:48:6d 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1107750174', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'bae2b00f-87e8-40b7-b7ba-972f7c531998', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2fb57e44-e877-47c8-860b-b36d5b5ff599', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1107750174', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '9', 'neutron:security_group_ids': '41f899d0-e5bc-43b7-808c-efb54f22dad4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.207', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=78b908b7-6c71-4e47-8053-0540c37dfe2c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], logical_port=d744a552-c706-444a-8a15-4a98c41eed50) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:24:18 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:18.709 142606 INFO neutron.agent.ovn.metadata.agent [-] Port d744a552-c706-444a-8a15-4a98c41eed50 in datapath 2fb57e44-e877-47c8-860b-b36d5b5ff599 unbound from our chassis#033[00m
Jan 23 05:24:18 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:18.710 142606 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2fb57e44-e877-47c8-860b-b36d5b5ff599, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:24:18 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:18.711 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[71387733-adef-4668-8ef5-678a917bac0b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:18 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:18.712 142606 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599 namespace which is not needed anymore#033[00m
Jan 23 05:24:18 np0005593295 nova_compute[225701]: 2026-01-23 10:24:18.799 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:18 np0005593295 nova_compute[225701]: 2026-01-23 10:24:18.820 225706 INFO nova.virt.libvirt.driver [-] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Instance destroyed successfully.#033[00m
Jan 23 05:24:18 np0005593295 nova_compute[225701]: 2026-01-23 10:24:18.821 225706 DEBUG nova.objects.instance [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'resources' on Instance uuid bae2b00f-87e8-40b7-b7ba-972f7c531998 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:24:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:18 np0005593295 neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599[234036]: [NOTICE]   (234045) : haproxy version is 2.8.14-c23fe91
Jan 23 05:24:18 np0005593295 neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599[234036]: [NOTICE]   (234045) : path to executable is /usr/sbin/haproxy
Jan 23 05:24:18 np0005593295 neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599[234036]: [WARNING]  (234045) : Exiting Master process...
Jan 23 05:24:18 np0005593295 neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599[234036]: [ALERT]    (234045) : Current worker (234048) exited with code 143 (Terminated)
Jan 23 05:24:18 np0005593295 neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599[234036]: [WARNING]  (234045) : All workers exited. Exiting... (0)
Jan 23 05:24:18 np0005593295 systemd[1]: libpod-c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220.scope: Deactivated successfully.
Jan 23 05:24:18 np0005593295 nova_compute[225701]: 2026-01-23 10:24:18.856 225706 DEBUG nova.virt.libvirt.vif [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:24:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1790835147',display_name='tempest-TestNetworkBasicOps-server-1790835147',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1790835147',id=9,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLwh/ci1qy20vB5FyaBepDv6KpYIxs8h6oo7gGlHu7RZtK7kr5mjuHzqdrX+yDa6v1DJrzMXWjaBuQGyTdeFGY8MLFkkRTd0XB8VJHoKHx7kcuI7EyiJu2dhMv2/NI1ZTg==',key_name='tempest-TestNetworkBasicOps-520442326',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:24:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-ox8kizyd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:24:15Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=bae2b00f-87e8-40b7-b7ba-972f7c531998,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d744a552-c706-444a-8a15-4a98c41eed50", "address": "fa:16:3e:9f:48:6d", "network": {"id": "2fb57e44-e877-47c8-860b-b36d5b5ff599", "bridge": "br-int", "label": "tempest-network-smoke--2143346610", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd744a552-c7", "ovs_interfaceid": "d744a552-c706-444a-8a15-4a98c41eed50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:24:18 np0005593295 nova_compute[225701]: 2026-01-23 10:24:18.857 225706 DEBUG nova.network.os_vif_util [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "d744a552-c706-444a-8a15-4a98c41eed50", "address": "fa:16:3e:9f:48:6d", "network": {"id": "2fb57e44-e877-47c8-860b-b36d5b5ff599", "bridge": "br-int", "label": "tempest-network-smoke--2143346610", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd744a552-c7", "ovs_interfaceid": "d744a552-c706-444a-8a15-4a98c41eed50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:24:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:18 np0005593295 nova_compute[225701]: 2026-01-23 10:24:18.857 225706 DEBUG nova.network.os_vif_util [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:48:6d,bridge_name='br-int',has_traffic_filtering=True,id=d744a552-c706-444a-8a15-4a98c41eed50,network=Network(2fb57e44-e877-47c8-860b-b36d5b5ff599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd744a552-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:24:18 np0005593295 nova_compute[225701]: 2026-01-23 10:24:18.858 225706 DEBUG os_vif [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:48:6d,bridge_name='br-int',has_traffic_filtering=True,id=d744a552-c706-444a-8a15-4a98c41eed50,network=Network(2fb57e44-e877-47c8-860b-b36d5b5ff599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd744a552-c7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:24:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:24:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:18.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:24:18 np0005593295 nova_compute[225701]: 2026-01-23 10:24:18.861 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:18 np0005593295 nova_compute[225701]: 2026-01-23 10:24:18.861 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd744a552-c7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:18 np0005593295 podman[234091]: 2026-01-23 10:24:18.862210762 +0000 UTC m=+0.048291327 container died c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:24:18 np0005593295 nova_compute[225701]: 2026-01-23 10:24:18.863 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:18 np0005593295 nova_compute[225701]: 2026-01-23 10:24:18.864 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:18 np0005593295 nova_compute[225701]: 2026-01-23 10:24:18.868 225706 INFO os_vif [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:48:6d,bridge_name='br-int',has_traffic_filtering=True,id=d744a552-c706-444a-8a15-4a98c41eed50,network=Network(2fb57e44-e877-47c8-860b-b36d5b5ff599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd744a552-c7')#033[00m
Jan 23 05:24:18 np0005593295 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220-userdata-shm.mount: Deactivated successfully.
Jan 23 05:24:18 np0005593295 systemd[1]: var-lib-containers-storage-overlay-71ae759b058fe5e6fdd63c6c93ecfa743881dbbef1d016ad8bb3ef3af3839996-merged.mount: Deactivated successfully.
Jan 23 05:24:18 np0005593295 podman[234091]: 2026-01-23 10:24:18.947082585 +0000 UTC m=+0.133163150 container cleanup c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:24:18 np0005593295 systemd[1]: libpod-conmon-c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220.scope: Deactivated successfully.
Jan 23 05:24:19 np0005593295 podman[234141]: 2026-01-23 10:24:19.012682066 +0000 UTC m=+0.043535000 container remove c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:24:19 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:19.019 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[9716fd1d-57c3-4543-9909-eb8fd1a7b564]: (4, ('Fri Jan 23 10:24:18 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599 (c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220)\nc73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220\nFri Jan 23 10:24:18 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599 (c73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220)\nc73ffcaa627253a3c8ab2d3f18209c27d64bdadb476eca3b8bd206852e828220\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:19.021 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[fb5be78b-4cde-4965-911a-4c0c5c422c32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:19.022 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2fb57e44-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:19 np0005593295 nova_compute[225701]: 2026-01-23 10:24:19.024 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:19 np0005593295 kernel: tap2fb57e44-e0: left promiscuous mode
Jan 23 05:24:19 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:19.029 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[768fad61-53b1-44b4-8dd6-49ed11bc3804]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593295 nova_compute[225701]: 2026-01-23 10:24:19.039 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:19 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:19.043 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[0ccb772a-4118-4689-98bc-8fa9cb9239e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:19.044 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[5615ba58-9798-4f09-b99f-9f661215b127]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:19.059 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[41ce2301-6cc8-4c56-95e2-6bc22183b4fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500534, 'reachable_time': 41585, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234159, 'error': None, 'target': 'ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:19.062 142723 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2fb57e44-e877-47c8-860b-b36d5b5ff599 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:24:19 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:19.062 142723 DEBUG oslo.privsep.daemon [-] privsep: reply[2f0bf27f-cc4f-4223-bdf6-454a8449ed03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593295 systemd[1]: run-netns-ovnmeta\x2d2fb57e44\x2de877\x2d47c8\x2d860b\x2db36d5b5ff599.mount: Deactivated successfully.
Jan 23 05:24:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:24:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:19.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:24:19 np0005593295 nova_compute[225701]: 2026-01-23 10:24:19.388 225706 DEBUG nova.compute.manager [req-88918fd3-b92e-4432-b441-38574c7989a0 req-c15eeadd-fd02-4d32-9f7f-1d4e7b98e5db 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Received event network-vif-unplugged-d744a552-c706-444a-8a15-4a98c41eed50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:19 np0005593295 nova_compute[225701]: 2026-01-23 10:24:19.389 225706 DEBUG oslo_concurrency.lockutils [req-88918fd3-b92e-4432-b441-38574c7989a0 req-c15eeadd-fd02-4d32-9f7f-1d4e7b98e5db 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:19 np0005593295 nova_compute[225701]: 2026-01-23 10:24:19.389 225706 DEBUG oslo_concurrency.lockutils [req-88918fd3-b92e-4432-b441-38574c7989a0 req-c15eeadd-fd02-4d32-9f7f-1d4e7b98e5db 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:19 np0005593295 nova_compute[225701]: 2026-01-23 10:24:19.389 225706 DEBUG oslo_concurrency.lockutils [req-88918fd3-b92e-4432-b441-38574c7989a0 req-c15eeadd-fd02-4d32-9f7f-1d4e7b98e5db 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:19 np0005593295 nova_compute[225701]: 2026-01-23 10:24:19.390 225706 DEBUG nova.compute.manager [req-88918fd3-b92e-4432-b441-38574c7989a0 req-c15eeadd-fd02-4d32-9f7f-1d4e7b98e5db 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] No waiting events found dispatching network-vif-unplugged-d744a552-c706-444a-8a15-4a98c41eed50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:24:19 np0005593295 nova_compute[225701]: 2026-01-23 10:24:19.390 225706 DEBUG nova.compute.manager [req-88918fd3-b92e-4432-b441-38574c7989a0 req-c15eeadd-fd02-4d32-9f7f-1d4e7b98e5db 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Received event network-vif-unplugged-d744a552-c706-444a-8a15-4a98c41eed50 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:24:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:20 np0005593295 nova_compute[225701]: 2026-01-23 10:24:20.312 225706 INFO nova.virt.libvirt.driver [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Deleting instance files /var/lib/nova/instances/bae2b00f-87e8-40b7-b7ba-972f7c531998_del#033[00m
Jan 23 05:24:20 np0005593295 nova_compute[225701]: 2026-01-23 10:24:20.313 225706 INFO nova.virt.libvirt.driver [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Deletion of /var/lib/nova/instances/bae2b00f-87e8-40b7-b7ba-972f7c531998_del complete#033[00m
Jan 23 05:24:20 np0005593295 nova_compute[225701]: 2026-01-23 10:24:20.367 225706 INFO nova.compute.manager [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Took 1.78 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:24:20 np0005593295 nova_compute[225701]: 2026-01-23 10:24:20.368 225706 DEBUG oslo.service.loopingcall [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:24:20 np0005593295 nova_compute[225701]: 2026-01-23 10:24:20.368 225706 DEBUG nova.compute.manager [-] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:24:20 np0005593295 nova_compute[225701]: 2026-01-23 10:24:20.368 225706 DEBUG nova.network.neutron [-] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:24:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:20.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:21.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:21 np0005593295 nova_compute[225701]: 2026-01-23 10:24:21.498 225706 DEBUG nova.compute.manager [req-6e74aab1-c147-449c-8cad-4251555d7c2e req-863ed47c-ccea-48b1-a841-af70aafd0712 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Received event network-vif-plugged-d744a552-c706-444a-8a15-4a98c41eed50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:21 np0005593295 nova_compute[225701]: 2026-01-23 10:24:21.498 225706 DEBUG oslo_concurrency.lockutils [req-6e74aab1-c147-449c-8cad-4251555d7c2e req-863ed47c-ccea-48b1-a841-af70aafd0712 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:21 np0005593295 nova_compute[225701]: 2026-01-23 10:24:21.498 225706 DEBUG oslo_concurrency.lockutils [req-6e74aab1-c147-449c-8cad-4251555d7c2e req-863ed47c-ccea-48b1-a841-af70aafd0712 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:21 np0005593295 nova_compute[225701]: 2026-01-23 10:24:21.499 225706 DEBUG oslo_concurrency.lockutils [req-6e74aab1-c147-449c-8cad-4251555d7c2e req-863ed47c-ccea-48b1-a841-af70aafd0712 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:21 np0005593295 nova_compute[225701]: 2026-01-23 10:24:21.499 225706 DEBUG nova.compute.manager [req-6e74aab1-c147-449c-8cad-4251555d7c2e req-863ed47c-ccea-48b1-a841-af70aafd0712 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] No waiting events found dispatching network-vif-plugged-d744a552-c706-444a-8a15-4a98c41eed50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:24:21 np0005593295 nova_compute[225701]: 2026-01-23 10:24:21.499 225706 WARNING nova.compute.manager [req-6e74aab1-c147-449c-8cad-4251555d7c2e req-863ed47c-ccea-48b1-a841-af70aafd0712 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Received unexpected event network-vif-plugged-d744a552-c706-444a-8a15-4a98c41eed50 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:24:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:22 np0005593295 nova_compute[225701]: 2026-01-23 10:24:22.295 225706 DEBUG nova.network.neutron [-] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:24:22 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Scheduled restart job, restart counter is at 14.
Jan 23 05:24:22 np0005593295 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:24:22 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Consumed 1.599s CPU time.
Jan 23 05:24:22 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Start request repeated too quickly.
Jan 23 05:24:22 np0005593295 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.1.0.compute-2.tykohi.service: Failed with result 'exit-code'.
Jan 23 05:24:22 np0005593295 systemd[1]: Failed to start Ceph nfs.cephfs.1.0.compute-2.tykohi for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 05:24:22 np0005593295 nova_compute[225701]: 2026-01-23 10:24:22.315 225706 INFO nova.compute.manager [-] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Took 1.95 seconds to deallocate network for instance.#033[00m
Jan 23 05:24:22 np0005593295 nova_compute[225701]: 2026-01-23 10:24:22.376 225706 DEBUG oslo_concurrency.lockutils [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:22 np0005593295 nova_compute[225701]: 2026-01-23 10:24:22.377 225706 DEBUG oslo_concurrency.lockutils [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:22 np0005593295 nova_compute[225701]: 2026-01-23 10:24:22.427 225706 DEBUG oslo_concurrency.processutils [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:22.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:22 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:24:22 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2724804691' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:24:22 np0005593295 nova_compute[225701]: 2026-01-23 10:24:22.907 225706 DEBUG oslo_concurrency.processutils [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:22 np0005593295 nova_compute[225701]: 2026-01-23 10:24:22.914 225706 DEBUG nova.compute.provider_tree [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:24:22 np0005593295 nova_compute[225701]: 2026-01-23 10:24:22.936 225706 DEBUG nova.scheduler.client.report [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:24:22 np0005593295 nova_compute[225701]: 2026-01-23 10:24:22.971 225706 DEBUG oslo_concurrency.lockutils [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.594s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:22 np0005593295 nova_compute[225701]: 2026-01-23 10:24:22.994 225706 INFO nova.scheduler.client.report [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Deleted allocations for instance bae2b00f-87e8-40b7-b7ba-972f7c531998#033[00m
Jan 23 05:24:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:23 np0005593295 nova_compute[225701]: 2026-01-23 10:24:23.105 225706 DEBUG oslo_concurrency.lockutils [None req-a8809a06-4f5c-4f0e-8ea3-0d9c5c4d7581 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "bae2b00f-87e8-40b7-b7ba-972f7c531998" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.524s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:23 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:23.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:23 np0005593295 podman[234188]: 2026-01-23 10:24:23.626042709 +0000 UTC m=+0.051406723 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 23 05:24:23 np0005593295 podman[234187]: 2026-01-23 10:24:23.650965901 +0000 UTC m=+0.078419566 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Jan 23 05:24:23 np0005593295 nova_compute[225701]: 2026-01-23 10:24:23.810 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:23 np0005593295 nova_compute[225701]: 2026-01-23 10:24:23.863 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:24.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:24:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:25.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:24:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:26.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:27.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:28 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/102428 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:24:28 np0005593295 nova_compute[225701]: 2026-01-23 10:24:28.813 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:28 np0005593295 nova_compute[225701]: 2026-01-23 10:24:28.865 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:28.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:29.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:30 np0005593295 nova_compute[225701]: 2026-01-23 10:24:30.567 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:30 np0005593295 nova_compute[225701]: 2026-01-23 10:24:30.641 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:24:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:30.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:24:30 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:30.923 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:24:30 np0005593295 nova_compute[225701]: 2026-01-23 10:24:30.924 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:30 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:30.924 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:24:30 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:30.925 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:31.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:32.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:33 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:33.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:33 np0005593295 nova_compute[225701]: 2026-01-23 10:24:33.816 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:33 np0005593295 nova_compute[225701]: 2026-01-23 10:24:33.818 225706 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163858.8164032, bae2b00f-87e8-40b7-b7ba-972f7c531998 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:24:33 np0005593295 nova_compute[225701]: 2026-01-23 10:24:33.818 225706 INFO nova.compute.manager [-] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:24:33 np0005593295 nova_compute[225701]: 2026-01-23 10:24:33.836 225706 DEBUG nova.compute.manager [None req-7596056e-82ba-4d7d-9765-c4ac4d5de086 - - - - - -] [instance: bae2b00f-87e8-40b7-b7ba-972f7c531998] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:24:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:33 np0005593295 nova_compute[225701]: 2026-01-23 10:24:33.867 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:24:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:34.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:24:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:24:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:35.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:24:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:36.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:37.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:38 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:38 np0005593295 nova_compute[225701]: 2026-01-23 10:24:38.868 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:24:38 np0005593295 nova_compute[225701]: 2026-01-23 10:24:38.869 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:24:38 np0005593295 nova_compute[225701]: 2026-01-23 10:24:38.870 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 05:24:38 np0005593295 nova_compute[225701]: 2026-01-23 10:24:38.870 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:24:38 np0005593295 nova_compute[225701]: 2026-01-23 10:24:38.871 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:38 np0005593295 nova_compute[225701]: 2026-01-23 10:24:38.872 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:24:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:38.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:24:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:39.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:24:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:40.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:41.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:42.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:24:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:43.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:24:43 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:43 np0005593295 nova_compute[225701]: 2026-01-23 10:24:43.872 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:43 np0005593295 nova_compute[225701]: 2026-01-23 10:24:43.874 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:44.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:24:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:45.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:24:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:24:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:46.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:24:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:47.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:48 np0005593295 nova_compute[225701]: 2026-01-23 10:24:48.873 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:48 np0005593295 nova_compute[225701]: 2026-01-23 10:24:48.875 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:48.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:49 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:49.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:24:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:50.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:24:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:24:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:51.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:24:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:52.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:53.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:53 np0005593295 nova_compute[225701]: 2026-01-23 10:24:53.876 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:53 np0005593295 nova_compute[225701]: 2026-01-23 10:24:53.878 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:24:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:54 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:54 np0005593295 podman[234314]: 2026-01-23 10:24:54.63048386 +0000 UTC m=+0.044041143 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 23 05:24:54 np0005593295 podman[234313]: 2026-01-23 10:24:54.651400654 +0000 UTC m=+0.074066020 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:24:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:54.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:24:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:55.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:24:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:55.492 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:55.493 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:24:55.493 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:56 np0005593295 podman[234482]: 2026-01-23 10:24:56.287892271 +0000 UTC m=+0.196817133 container exec 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 05:24:56 np0005593295 podman[234482]: 2026-01-23 10:24:56.421091631 +0000 UTC m=+0.330016473 container exec_died 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 05:24:56 np0005593295 podman[234586]: 2026-01-23 10:24:56.789415804 +0000 UTC m=+0.060335222 container exec 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 05:24:56 np0005593295 podman[234586]: 2026-01-23 10:24:56.808175164 +0000 UTC m=+0.079094582 container exec_died 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 05:24:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:24:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:56.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:24:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:57 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:24:57 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:24:57 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 05:24:57 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:24:57 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:24:57 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 05:24:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:57.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:57 np0005593295 podman[234742]: 2026-01-23 10:24:57.442432236 +0000 UTC m=+0.052678794 container exec c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 05:24:57 np0005593295 podman[234742]: 2026-01-23 10:24:57.48210197 +0000 UTC m=+0.092348538 container exec_died c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 05:24:57 np0005593295 podman[234809]: 2026-01-23 10:24:57.675748844 +0000 UTC m=+0.047900627 container exec 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, io.buildah.version=1.28.2, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, io.openshift.tags=Ceph keepalived)
Jan 23 05:24:57 np0005593295 podman[234809]: 2026-01-23 10:24:57.689972843 +0000 UTC m=+0.062124616 container exec_died 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, vcs-type=git, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, distribution-scope=public)
Jan 23 05:24:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:58 np0005593295 nova_compute[225701]: 2026-01-23 10:24:58.877 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:24:58 np0005593295 nova_compute[225701]: 2026-01-23 10:24:58.878 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:58 np0005593295 nova_compute[225701]: 2026-01-23 10:24:58.878 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 05:24:58 np0005593295 nova_compute[225701]: 2026-01-23 10:24:58.878 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:24:58 np0005593295 nova_compute[225701]: 2026-01-23 10:24:58.878 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:24:58 np0005593295 nova_compute[225701]: 2026-01-23 10:24:58.879 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:24:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:58.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:24:59 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:24:59 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:24:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:24:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:59 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:24:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:24:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:59.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:24:59 np0005593295 podman[235121]: 2026-01-23 10:24:59.55207572 +0000 UTC m=+0.052044759 container create 462223d4168db8a0e9758ca7f3e67b11ab117734f21672b5b83632902c29952b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_jennings, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0)
Jan 23 05:24:59 np0005593295 systemd[1]: Started libpod-conmon-462223d4168db8a0e9758ca7f3e67b11ab117734f21672b5b83632902c29952b.scope.
Jan 23 05:24:59 np0005593295 podman[235121]: 2026-01-23 10:24:59.525587249 +0000 UTC m=+0.025556378 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 05:24:59 np0005593295 systemd[1]: Started libcrun container.
Jan 23 05:24:59 np0005593295 podman[235121]: 2026-01-23 10:24:59.643629947 +0000 UTC m=+0.143599016 container init 462223d4168db8a0e9758ca7f3e67b11ab117734f21672b5b83632902c29952b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_jennings, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Jan 23 05:24:59 np0005593295 podman[235121]: 2026-01-23 10:24:59.655001066 +0000 UTC m=+0.154970105 container start 462223d4168db8a0e9758ca7f3e67b11ab117734f21672b5b83632902c29952b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_jennings, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 05:24:59 np0005593295 podman[235121]: 2026-01-23 10:24:59.65883328 +0000 UTC m=+0.158802319 container attach 462223d4168db8a0e9758ca7f3e67b11ab117734f21672b5b83632902c29952b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_jennings, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 05:24:59 np0005593295 awesome_jennings[235137]: 167 167
Jan 23 05:24:59 np0005593295 systemd[1]: libpod-462223d4168db8a0e9758ca7f3e67b11ab117734f21672b5b83632902c29952b.scope: Deactivated successfully.
Jan 23 05:24:59 np0005593295 podman[235142]: 2026-01-23 10:24:59.710347015 +0000 UTC m=+0.029140326 container died 462223d4168db8a0e9758ca7f3e67b11ab117734f21672b5b83632902c29952b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_jennings, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:24:59 np0005593295 systemd[1]: var-lib-containers-storage-overlay-98f0838e9607886778be7dd3570121eb5eb3d92285ec27d408631f743dc5edab-merged.mount: Deactivated successfully.
Jan 23 05:24:59 np0005593295 podman[235142]: 2026-01-23 10:24:59.749845275 +0000 UTC m=+0.068638566 container remove 462223d4168db8a0e9758ca7f3e67b11ab117734f21672b5b83632902c29952b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_jennings, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 05:24:59 np0005593295 systemd[1]: libpod-conmon-462223d4168db8a0e9758ca7f3e67b11ab117734f21672b5b83632902c29952b.scope: Deactivated successfully.
Jan 23 05:24:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:24:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:24:59 np0005593295 podman[235164]: 2026-01-23 10:24:59.980987589 +0000 UTC m=+0.050397898 container create 6ba9173e524f454feb97d3433d77b9d34e5b728a93342094e581e918794e2951 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_chaum, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Jan 23 05:25:00 np0005593295 systemd[1]: Started libpod-conmon-6ba9173e524f454feb97d3433d77b9d34e5b728a93342094e581e918794e2951.scope.
Jan 23 05:25:00 np0005593295 systemd[1]: Started libcrun container.
Jan 23 05:25:00 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a252f223d368a15287f2111f680585f7a104f5d6a0ccded61b3835a4f8c43a6d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 05:25:00 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a252f223d368a15287f2111f680585f7a104f5d6a0ccded61b3835a4f8c43a6d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 05:25:00 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a252f223d368a15287f2111f680585f7a104f5d6a0ccded61b3835a4f8c43a6d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 05:25:00 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a252f223d368a15287f2111f680585f7a104f5d6a0ccded61b3835a4f8c43a6d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 05:25:00 np0005593295 podman[235164]: 2026-01-23 10:24:59.96105417 +0000 UTC m=+0.030464509 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 05:25:00 np0005593295 podman[235164]: 2026-01-23 10:25:00.056111054 +0000 UTC m=+0.125521363 container init 6ba9173e524f454feb97d3433d77b9d34e5b728a93342094e581e918794e2951 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_chaum, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Jan 23 05:25:00 np0005593295 podman[235164]: 2026-01-23 10:25:00.06447986 +0000 UTC m=+0.133890149 container start 6ba9173e524f454feb97d3433d77b9d34e5b728a93342094e581e918794e2951 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_chaum, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Jan 23 05:25:00 np0005593295 podman[235164]: 2026-01-23 10:25:00.067707809 +0000 UTC m=+0.137118108 container attach 6ba9173e524f454feb97d3433d77b9d34e5b728a93342094e581e918794e2951 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_chaum, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1)
Jan 23 05:25:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:00 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:25:00 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]: [
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:    {
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:        "available": false,
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:        "being_replaced": false,
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:        "ceph_device_lvm": false,
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:        "lsm_data": {},
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:        "lvs": [],
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:        "path": "/dev/sr0",
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:        "rejected_reasons": [
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "Insufficient space (<5GB)",
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "Has a FileSystem"
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:        ],
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:        "sys_api": {
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "actuators": null,
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "device_nodes": [
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:                "sr0"
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            ],
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "devname": "sr0",
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "human_readable_size": "482.00 KB",
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "id_bus": "ata",
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "model": "QEMU DVD-ROM",
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "nr_requests": "2",
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "parent": "/dev/sr0",
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "partitions": {},
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "path": "/dev/sr0",
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "removable": "1",
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "rev": "2.5+",
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "ro": "0",
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "rotational": "1",
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "sas_address": "",
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "sas_device_handle": "",
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "scheduler_mode": "mq-deadline",
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "sectors": 0,
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "sectorsize": "2048",
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "size": 493568.0,
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "support_discard": "2048",
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "type": "disk",
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:            "vendor": "QEMU"
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:        }
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]:    }
Jan 23 05:25:00 np0005593295 condescending_chaum[235180]: ]
Jan 23 05:25:00 np0005593295 systemd[1]: libpod-6ba9173e524f454feb97d3433d77b9d34e5b728a93342094e581e918794e2951.scope: Deactivated successfully.
Jan 23 05:25:00 np0005593295 podman[235164]: 2026-01-23 10:25:00.833466619 +0000 UTC m=+0.902876928 container died 6ba9173e524f454feb97d3433d77b9d34e5b728a93342094e581e918794e2951 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_chaum, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 05:25:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:00 np0005593295 systemd[1]: var-lib-containers-storage-overlay-a252f223d368a15287f2111f680585f7a104f5d6a0ccded61b3835a4f8c43a6d-merged.mount: Deactivated successfully.
Jan 23 05:25:00 np0005593295 podman[235164]: 2026-01-23 10:25:00.880011711 +0000 UTC m=+0.949422000 container remove 6ba9173e524f454feb97d3433d77b9d34e5b728a93342094e581e918794e2951 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_chaum, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Jan 23 05:25:00 np0005593295 systemd[1]: libpod-conmon-6ba9173e524f454feb97d3433d77b9d34e5b728a93342094e581e918794e2951.scope: Deactivated successfully.
Jan 23 05:25:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:00.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:01.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:01 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:25:01 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:25:01 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 05:25:01 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:25:01 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:25:01 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:25:01 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:25:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:02 np0005593295 ceph-mon[75771]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Jan 23 05:25:02 np0005593295 nova_compute[225701]: 2026-01-23 10:25:02.780 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:02.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:25:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:03.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:25:03 np0005593295 nova_compute[225701]: 2026-01-23 10:25:03.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:03 np0005593295 nova_compute[225701]: 2026-01-23 10:25:03.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:25:03 np0005593295 nova_compute[225701]: 2026-01-23 10:25:03.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:25:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:03 np0005593295 nova_compute[225701]: 2026-01-23 10:25:03.880 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:25:03 np0005593295 nova_compute[225701]: 2026-01-23 10:25:03.881 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:03 np0005593295 nova_compute[225701]: 2026-01-23 10:25:03.881 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 05:25:03 np0005593295 nova_compute[225701]: 2026-01-23 10:25:03.882 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:25:03 np0005593295 nova_compute[225701]: 2026-01-23 10:25:03.882 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:25:03 np0005593295 nova_compute[225701]: 2026-01-23 10:25:03.883 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:04 np0005593295 nova_compute[225701]: 2026-01-23 10:25:04.218 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:25:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:04.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:05.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:06 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:06 np0005593295 nova_compute[225701]: 2026-01-23 10:25:06.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:25:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:06.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:25:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:07.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:08 np0005593295 nova_compute[225701]: 2026-01-23 10:25:08.780 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:08 np0005593295 nova_compute[225701]: 2026-01-23 10:25:08.928 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:25:08 np0005593295 nova_compute[225701]: 2026-01-23 10:25:08.929 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:08 np0005593295 nova_compute[225701]: 2026-01-23 10:25:08.929 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5045 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 05:25:08 np0005593295 nova_compute[225701]: 2026-01-23 10:25:08.929 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:25:08 np0005593295 nova_compute[225701]: 2026-01-23 10:25:08.930 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:25:08 np0005593295 nova_compute[225701]: 2026-01-23 10:25:08.931 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:08.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:25:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:09.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:25:09 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:25:09 np0005593295 nova_compute[225701]: 2026-01-23 10:25:09.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:09 np0005593295 nova_compute[225701]: 2026-01-23 10:25:09.946 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:09 np0005593295 nova_compute[225701]: 2026-01-23 10:25:09.947 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:09 np0005593295 nova_compute[225701]: 2026-01-23 10:25:09.947 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:09 np0005593295 nova_compute[225701]: 2026-01-23 10:25:09.947 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:25:09 np0005593295 nova_compute[225701]: 2026-01-23 10:25:09.948 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:10 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:25:10 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1091720736' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:25:10 np0005593295 nova_compute[225701]: 2026-01-23 10:25:10.452 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:10 np0005593295 nova_compute[225701]: 2026-01-23 10:25:10.634 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:25:10 np0005593295 nova_compute[225701]: 2026-01-23 10:25:10.635 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4880MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:25:10 np0005593295 nova_compute[225701]: 2026-01-23 10:25:10.635 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:10 np0005593295 nova_compute[225701]: 2026-01-23 10:25:10.636 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:10 np0005593295 nova_compute[225701]: 2026-01-23 10:25:10.774 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:25:10 np0005593295 nova_compute[225701]: 2026-01-23 10:25:10.774 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:25:10 np0005593295 nova_compute[225701]: 2026-01-23 10:25:10.792 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:10 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:25:10 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:25:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:10.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:11 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:25:11 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1971763186' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:25:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:25:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:11.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:25:11 np0005593295 nova_compute[225701]: 2026-01-23 10:25:11.282 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:11 np0005593295 nova_compute[225701]: 2026-01-23 10:25:11.288 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:25:11 np0005593295 nova_compute[225701]: 2026-01-23 10:25:11.301 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:25:11 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:11 np0005593295 nova_compute[225701]: 2026-01-23 10:25:11.322 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:25:11 np0005593295 nova_compute[225701]: 2026-01-23 10:25:11.322 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.687s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:12 np0005593295 nova_compute[225701]: 2026-01-23 10:25:12.323 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:12 np0005593295 nova_compute[225701]: 2026-01-23 10:25:12.323 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:12 np0005593295 nova_compute[225701]: 2026-01-23 10:25:12.324 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:12 np0005593295 nova_compute[225701]: 2026-01-23 10:25:12.324 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:12 np0005593295 nova_compute[225701]: 2026-01-23 10:25:12.324 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:12 np0005593295 nova_compute[225701]: 2026-01-23 10:25:12.324 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:25:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:12.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.017547) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913017765, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 1267, "num_deletes": 251, "total_data_size": 3164328, "memory_usage": 3227232, "flush_reason": "Manual Compaction"}
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913030854, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 2012832, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29978, "largest_seqno": 31239, "table_properties": {"data_size": 2007176, "index_size": 2987, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12560, "raw_average_key_size": 20, "raw_value_size": 1995757, "raw_average_value_size": 3234, "num_data_blocks": 128, "num_entries": 617, "num_filter_entries": 617, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163822, "oldest_key_time": 1769163822, "file_creation_time": 1769163913, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 13331 microseconds, and 5871 cpu microseconds.
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.030930) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 2012832 bytes OK
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.030958) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.032879) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.032898) EVENT_LOG_v1 {"time_micros": 1769163913032894, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.032919) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 3158232, prev total WAL file size 3158232, number of live WAL files 2.
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.033992) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(1965KB)], [57(12MB)]
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913034349, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 14774089, "oldest_snapshot_seqno": -1}
Jan 23 05:25:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5879 keys, 12616502 bytes, temperature: kUnknown
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913164851, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 12616502, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12578658, "index_size": 22054, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14725, "raw_key_size": 152049, "raw_average_key_size": 25, "raw_value_size": 12473824, "raw_average_value_size": 2121, "num_data_blocks": 881, "num_entries": 5879, "num_filter_entries": 5879, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769163913, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.165139) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 12616502 bytes
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.167531) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 113.1 rd, 96.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 12.2 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(13.6) write-amplify(6.3) OK, records in: 6400, records dropped: 521 output_compression: NoCompression
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.167547) EVENT_LOG_v1 {"time_micros": 1769163913167540, "job": 34, "event": "compaction_finished", "compaction_time_micros": 130591, "compaction_time_cpu_micros": 55216, "output_level": 6, "num_output_files": 1, "total_output_size": 12616502, "num_input_records": 6400, "num_output_records": 5879, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913168099, "job": 34, "event": "table_file_deletion", "file_number": 59}
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913170252, "job": 34, "event": "table_file_deletion", "file_number": 57}
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.033742) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.170310) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.170314) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.170316) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.170318) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:25:13 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:25:13.170320) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:25:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:13.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:13 np0005593295 ovn_controller[132789]: 2026-01-23T10:25:13Z|00054|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 23 05:25:13 np0005593295 nova_compute[225701]: 2026-01-23 10:25:13.931 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:14.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:25:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:15.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:25:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:16 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:25:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:16.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:25:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:17.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:25:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:18.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:25:18 np0005593295 nova_compute[225701]: 2026-01-23 10:25:18.975 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:25:18 np0005593295 nova_compute[225701]: 2026-01-23 10:25:18.976 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:18 np0005593295 nova_compute[225701]: 2026-01-23 10:25:18.976 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5043 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 05:25:18 np0005593295 nova_compute[225701]: 2026-01-23 10:25:18.976 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:25:18 np0005593295 nova_compute[225701]: 2026-01-23 10:25:18.977 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:25:18 np0005593295 nova_compute[225701]: 2026-01-23 10:25:18.977 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:25:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:19.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:25:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:25:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:20.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:25:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:25:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:21.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:25:21 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:22.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:25:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:23.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:25:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:23 np0005593295 nova_compute[225701]: 2026-01-23 10:25:23.979 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:24.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:25.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:25 np0005593295 podman[236451]: 2026-01-23 10:25:25.666941337 +0000 UTC m=+0.080109449 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 05:25:25 np0005593295 podman[236448]: 2026-01-23 10:25:25.690793162 +0000 UTC m=+0.101726368 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 23 05:25:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:26 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:25:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:26.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:25:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:25:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:27.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:25:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:28.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:28 np0005593295 nova_compute[225701]: 2026-01-23 10:25:28.980 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:29.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:30.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:31.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:31 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:32 np0005593295 nova_compute[225701]: 2026-01-23 10:25:32.040 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:32 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:25:32.041 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:25:32 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:25:32.044 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:25:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:25:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:32.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:25:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:33.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:33 np0005593295 nova_compute[225701]: 2026-01-23 10:25:33.981 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:25:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:34.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:25:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:35.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:36 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:25:36.046 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:36 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:36.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:25:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:37.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:25:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:38 np0005593295 nova_compute[225701]: 2026-01-23 10:25:38.983 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:38.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:25:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:39.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:25:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:40.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:25:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:41.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:25:41 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:25:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:42.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:25:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:43.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:43 np0005593295 nova_compute[225701]: 2026-01-23 10:25:43.984 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:43 np0005593295 nova_compute[225701]: 2026-01-23 10:25:43.986 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:44.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:45.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:46 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:46.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:47.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [WARNING] 022/102547 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 05:25:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj[83790]: [ALERT] 022/102547 (4) : backend 'backend' has no server available!
Jan 23 05:25:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:48 np0005593295 nova_compute[225701]: 2026-01-23 10:25:48.986 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:25:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:49.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:25:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:49.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:51.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:25:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:51.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:25:51 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:53.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:53.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:53 np0005593295 nova_compute[225701]: 2026-01-23 10:25:53.521 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:53 np0005593295 nova_compute[225701]: 2026-01-23 10:25:53.521 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:53 np0005593295 nova_compute[225701]: 2026-01-23 10:25:53.543 225706 DEBUG nova.compute.manager [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:25:53 np0005593295 nova_compute[225701]: 2026-01-23 10:25:53.612 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:53 np0005593295 nova_compute[225701]: 2026-01-23 10:25:53.613 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:53 np0005593295 nova_compute[225701]: 2026-01-23 10:25:53.619 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:25:53 np0005593295 nova_compute[225701]: 2026-01-23 10:25:53.620 225706 INFO nova.compute.claims [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:25:53 np0005593295 nova_compute[225701]: 2026-01-23 10:25:53.717 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:53 np0005593295 nova_compute[225701]: 2026-01-23 10:25:53.989 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:54 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:25:54 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4284066152' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.220 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.225 225706 DEBUG nova.compute.provider_tree [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.239 225706 DEBUG nova.scheduler.client.report [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.266 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.267 225706 DEBUG nova.compute.manager [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.322 225706 DEBUG nova.compute.manager [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.322 225706 DEBUG nova.network.neutron [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.348 225706 INFO nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.365 225706 DEBUG nova.compute.manager [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.455 225706 DEBUG nova.compute.manager [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.457 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.457 225706 INFO nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Creating image(s)#033[00m
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.491 225706 DEBUG nova.storage.rbd_utils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.521 225706 DEBUG nova.storage.rbd_utils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.546 225706 DEBUG nova.storage.rbd_utils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.549 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.606 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.608 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "379b2821245bc82aa5a95839eddb9a97716b559c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.608 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.609 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.637 225706 DEBUG nova.storage.rbd_utils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.641 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.804 225706 DEBUG nova.policy [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f459c4e71e6c47acb0f8aaf83f34695e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:25:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:54 np0005593295 nova_compute[225701]: 2026-01-23 10:25:54.944 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.303s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:55 np0005593295 nova_compute[225701]: 2026-01-23 10:25:55.009 225706 DEBUG nova.storage.rbd_utils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] resizing rbd image 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:25:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:55.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:55 np0005593295 nova_compute[225701]: 2026-01-23 10:25:55.119 225706 DEBUG nova.objects.instance [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'migration_context' on Instance uuid 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:25:55 np0005593295 nova_compute[225701]: 2026-01-23 10:25:55.138 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:25:55 np0005593295 nova_compute[225701]: 2026-01-23 10:25:55.139 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Ensure instance console log exists: /var/lib/nova/instances/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:25:55 np0005593295 nova_compute[225701]: 2026-01-23 10:25:55.139 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:55 np0005593295 nova_compute[225701]: 2026-01-23 10:25:55.139 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:55 np0005593295 nova_compute[225701]: 2026-01-23 10:25:55.140 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:55.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:25:55.493 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:25:55.494 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:25:55.494 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:55 np0005593295 nova_compute[225701]: 2026-01-23 10:25:55.936 225706 DEBUG nova.network.neutron [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Successfully created port: 2611e513-4316-4421-8b89-1c0f37157967 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:25:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:56 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:56 np0005593295 podman[236742]: 2026-01-23 10:25:56.636810727 +0000 UTC m=+0.055503634 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 23 05:25:56 np0005593295 podman[236741]: 2026-01-23 10:25:56.662853216 +0000 UTC m=+0.083897571 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 23 05:25:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:56 np0005593295 nova_compute[225701]: 2026-01-23 10:25:56.915 225706 DEBUG nova.network.neutron [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Successfully updated port: 2611e513-4316-4421-8b89-1c0f37157967 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:25:56 np0005593295 nova_compute[225701]: 2026-01-23 10:25:56.930 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:25:56 np0005593295 nova_compute[225701]: 2026-01-23 10:25:56.930 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:25:56 np0005593295 nova_compute[225701]: 2026-01-23 10:25:56.930 225706 DEBUG nova.network.neutron [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:25:56 np0005593295 nova_compute[225701]: 2026-01-23 10:25:56.990 225706 DEBUG nova.compute.manager [req-37a62512-3741-49c5-ab99-8ec2a66776e8 req-57b12e5a-c6be-4e5f-a9cd-33a092da7297 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-changed-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:56 np0005593295 nova_compute[225701]: 2026-01-23 10:25:56.990 225706 DEBUG nova.compute.manager [req-37a62512-3741-49c5-ab99-8ec2a66776e8 req-57b12e5a-c6be-4e5f-a9cd-33a092da7297 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Refreshing instance network info cache due to event network-changed-2611e513-4316-4421-8b89-1c0f37157967. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:25:56 np0005593295 nova_compute[225701]: 2026-01-23 10:25:56.991 225706 DEBUG oslo_concurrency.lockutils [req-37a62512-3741-49c5-ab99-8ec2a66776e8 req-57b12e5a-c6be-4e5f-a9cd-33a092da7297 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:25:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:57.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:57 np0005593295 nova_compute[225701]: 2026-01-23 10:25:57.042 225706 DEBUG nova.network.neutron [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:25:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:57.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:57 np0005593295 nova_compute[225701]: 2026-01-23 10:25:57.982 225706 DEBUG nova.network.neutron [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updating instance_info_cache with network_info: [{"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.000 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.001 225706 DEBUG nova.compute.manager [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Instance network_info: |[{"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.001 225706 DEBUG oslo_concurrency.lockutils [req-37a62512-3741-49c5-ab99-8ec2a66776e8 req-57b12e5a-c6be-4e5f-a9cd-33a092da7297 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.001 225706 DEBUG nova.network.neutron [req-37a62512-3741-49c5-ab99-8ec2a66776e8 req-57b12e5a-c6be-4e5f-a9cd-33a092da7297 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Refreshing network info cache for port 2611e513-4316-4421-8b89-1c0f37157967 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.003 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Start _get_guest_xml network_info=[{"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_options': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '271ec98e-d058-421b-bbfb-4b4a5954c90a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.007 225706 WARNING nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.015 225706 DEBUG nova.virt.libvirt.host [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.016 225706 DEBUG nova.virt.libvirt.host [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.021 225706 DEBUG nova.virt.libvirt.host [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.022 225706 DEBUG nova.virt.libvirt.host [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.022 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.022 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T10:15:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1d8c8bf4-786e-4009-bc53-f259480fb5b3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.023 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.023 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.023 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.024 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.024 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.024 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.024 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.025 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.025 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.025 225706 DEBUG nova.virt.hardware [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.028 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:58 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 05:25:58 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1766427025' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.498 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.524 225706 DEBUG nova.storage.rbd_utils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.528 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:58 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 05:25:58 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2805498511' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:25:58 np0005593295 nova_compute[225701]: 2026-01-23 10:25:58.990 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:59.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.020 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.021 225706 DEBUG nova.virt.libvirt.vif [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:25:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1129329512',display_name='tempest-TestNetworkBasicOps-server-1129329512',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1129329512',id=11,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBOLllCuGpYDHB8HQl4gVCADogEY6z7uz5xBJbTjU7iL3TTWWE5uwU0nWT40qz7D0IhyDFXlwX4fWDCogYSyOPhCdGvOGsxFut3XTWNKcRsbqCULLjO4VMFh09pWX8E0IA==',key_name='tempest-TestNetworkBasicOps-1378329290',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-6r33a8b6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:25:54Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.021 225706 DEBUG nova.network.os_vif_util [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.022 225706 DEBUG nova.network.os_vif_util [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:a8:f2,bridge_name='br-int',has_traffic_filtering=True,id=2611e513-4316-4421-8b89-1c0f37157967,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2611e513-43') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.023 225706 DEBUG nova.objects.instance [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'pci_devices' on Instance uuid 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.042 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:25:59 np0005593295 nova_compute[225701]:  <uuid>65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad</uuid>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:  <name>instance-0000000b</name>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:  <memory>131072</memory>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:  <vcpu>1</vcpu>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:  <metadata>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <nova:name>tempest-TestNetworkBasicOps-server-1129329512</nova:name>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <nova:creationTime>2026-01-23 10:25:58</nova:creationTime>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <nova:flavor name="m1.nano">
Jan 23 05:25:59 np0005593295 nova_compute[225701]:        <nova:memory>128</nova:memory>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:        <nova:disk>1</nova:disk>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:        <nova:swap>0</nova:swap>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      </nova:flavor>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <nova:owner>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:        <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:        <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      </nova:owner>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <nova:ports>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:        <nova:port uuid="2611e513-4316-4421-8b89-1c0f37157967">
Jan 23 05:25:59 np0005593295 nova_compute[225701]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:        </nova:port>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      </nova:ports>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    </nova:instance>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:  </metadata>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:  <sysinfo type="smbios">
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <system>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <entry name="serial">65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad</entry>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <entry name="uuid">65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad</entry>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    </system>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:  </sysinfo>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:  <os>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <boot dev="hd"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <smbios mode="sysinfo"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:  </os>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:  <features>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <acpi/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <apic/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <vmcoreinfo/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:  </features>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:  <clock offset="utc">
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <timer name="hpet" present="no"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:  </clock>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:  <cpu mode="host-model" match="exact">
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:  </cpu>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:  <devices>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <disk type="network" device="disk">
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <driver type="raw" cache="none"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <source protocol="rbd" name="vms/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk">
Jan 23 05:25:59 np0005593295 nova_compute[225701]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      </source>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <auth username="openstack">
Jan 23 05:25:59 np0005593295 nova_compute[225701]:        <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      </auth>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <target dev="vda" bus="virtio"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    </disk>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <disk type="network" device="cdrom">
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <driver type="raw" cache="none"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <source protocol="rbd" name="vms/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk.config">
Jan 23 05:25:59 np0005593295 nova_compute[225701]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      </source>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <auth username="openstack">
Jan 23 05:25:59 np0005593295 nova_compute[225701]:        <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      </auth>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <target dev="sda" bus="sata"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    </disk>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <interface type="ethernet">
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <mac address="fa:16:3e:58:a8:f2"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <model type="virtio"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <mtu size="1442"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <target dev="tap2611e513-43"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    </interface>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <serial type="pty">
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <log file="/var/lib/nova/instances/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad/console.log" append="off"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    </serial>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <video>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <model type="virtio"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    </video>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <input type="tablet" bus="usb"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <rng model="virtio">
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    </rng>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <controller type="usb" index="0"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    <memballoon model="virtio">
Jan 23 05:25:59 np0005593295 nova_compute[225701]:      <stats period="10"/>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:    </memballoon>
Jan 23 05:25:59 np0005593295 nova_compute[225701]:  </devices>
Jan 23 05:25:59 np0005593295 nova_compute[225701]: </domain>
Jan 23 05:25:59 np0005593295 nova_compute[225701]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.042 225706 DEBUG nova.compute.manager [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Preparing to wait for external event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.043 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.043 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.043 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.044 225706 DEBUG nova.virt.libvirt.vif [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:25:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1129329512',display_name='tempest-TestNetworkBasicOps-server-1129329512',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1129329512',id=11,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBOLllCuGpYDHB8HQl4gVCADogEY6z7uz5xBJbTjU7iL3TTWWE5uwU0nWT40qz7D0IhyDFXlwX4fWDCogYSyOPhCdGvOGsxFut3XTWNKcRsbqCULLjO4VMFh09pWX8E0IA==',key_name='tempest-TestNetworkBasicOps-1378329290',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-6r33a8b6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:25:54Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.045 225706 DEBUG nova.network.os_vif_util [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.046 225706 DEBUG nova.network.os_vif_util [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:a8:f2,bridge_name='br-int',has_traffic_filtering=True,id=2611e513-4316-4421-8b89-1c0f37157967,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2611e513-43') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.046 225706 DEBUG os_vif [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:a8:f2,bridge_name='br-int',has_traffic_filtering=True,id=2611e513-4316-4421-8b89-1c0f37157967,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2611e513-43') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.047 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.047 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.048 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.053 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.054 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2611e513-43, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.055 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2611e513-43, col_values=(('external_ids', {'iface-id': '2611e513-4316-4421-8b89-1c0f37157967', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:58:a8:f2', 'vm-uuid': '65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.056 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:59 np0005593295 NetworkManager[48964]: <info>  [1769163959.0570] manager: (tap2611e513-43): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.059 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.062 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.063 225706 INFO os_vif [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:a8:f2,bridge_name='br-int',has_traffic_filtering=True,id=2611e513-4316-4421-8b89-1c0f37157967,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2611e513-43')#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.116 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:25:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:25:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.117 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.117 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:58:a8:f2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.118 225706 INFO nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Using config drive#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.139 225706 DEBUG nova.storage.rbd_utils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:25:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:25:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:25:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:59.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.449 225706 DEBUG nova.network.neutron [req-37a62512-3741-49c5-ab99-8ec2a66776e8 req-57b12e5a-c6be-4e5f-a9cd-33a092da7297 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updated VIF entry in instance network info cache for port 2611e513-4316-4421-8b89-1c0f37157967. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.450 225706 DEBUG nova.network.neutron [req-37a62512-3741-49c5-ab99-8ec2a66776e8 req-57b12e5a-c6be-4e5f-a9cd-33a092da7297 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updating instance_info_cache with network_info: [{"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.473 225706 DEBUG oslo_concurrency.lockutils [req-37a62512-3741-49c5-ab99-8ec2a66776e8 req-57b12e5a-c6be-4e5f-a9cd-33a092da7297 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.492 225706 INFO nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Creating config drive at /var/lib/nova/instances/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad/disk.config#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.496 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpczvl3v5l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.619 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpczvl3v5l" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.646 225706 DEBUG nova.storage.rbd_utils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.650 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad/disk.config 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.823 225706 DEBUG oslo_concurrency.processutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad/disk.config 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.173s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.824 225706 INFO nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Deleting local config drive /var/lib/nova/instances/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad/disk.config because it was imported into RBD.#033[00m
Jan 23 05:25:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:25:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:25:59 np0005593295 kernel: tap2611e513-43: entered promiscuous mode
Jan 23 05:25:59 np0005593295 NetworkManager[48964]: <info>  [1769163959.8805] manager: (tap2611e513-43): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Jan 23 05:25:59 np0005593295 ovn_controller[132789]: 2026-01-23T10:25:59Z|00055|binding|INFO|Claiming lport 2611e513-4316-4421-8b89-1c0f37157967 for this chassis.
Jan 23 05:25:59 np0005593295 ovn_controller[132789]: 2026-01-23T10:25:59Z|00056|binding|INFO|2611e513-4316-4421-8b89-1c0f37157967: Claiming fa:16:3e:58:a8:f2 10.100.0.13
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.880 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.884 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.894 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:59 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.905 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:a8:f2 10.100.0.13'], port_security=['fa:16:3e:58:a8:f2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-712c0ef6-fbbe-4577-b44d-9610116b414a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1d01fb50-5068-4dfb-b608-e6e67ad89b2a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3547f5ca-ca7c-4ba0-a5f8-3ad2055eb8ec, chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], logical_port=2611e513-4316-4421-8b89-1c0f37157967) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:25:59 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.907 142606 INFO neutron.agent.ovn.metadata.agent [-] Port 2611e513-4316-4421-8b89-1c0f37157967 in datapath 712c0ef6-fbbe-4577-b44d-9610116b414a bound to our chassis#033[00m
Jan 23 05:25:59 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.910 142606 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 712c0ef6-fbbe-4577-b44d-9610116b414a#033[00m
Jan 23 05:25:59 np0005593295 systemd-machined[194368]: New machine qemu-4-instance-0000000b.
Jan 23 05:25:59 np0005593295 systemd-udevd[236921]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:25:59 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.926 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[3580955f-fda1-42b7-ae7c-ef57513e90f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:59 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.928 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap712c0ef6-f1 in ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:25:59 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.929 229823 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap712c0ef6-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:25:59 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.929 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[21a603aa-e13f-4389-9141-d50f7a9d132f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:59 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.931 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[584e043c-cd28-4763-909b-908eca1c5eb0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:59 np0005593295 NetworkManager[48964]: <info>  [1769163959.9330] device (tap2611e513-43): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:25:59 np0005593295 NetworkManager[48964]: <info>  [1769163959.9336] device (tap2611e513-43): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:25:59 np0005593295 systemd[1]: Started Virtual Machine qemu-4-instance-0000000b.
Jan 23 05:25:59 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.946 142723 DEBUG oslo.privsep.daemon [-] privsep: reply[52df5c38-8da3-4a11-9cb3-efc6c6a9c483]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.956 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:59 np0005593295 ovn_controller[132789]: 2026-01-23T10:25:59Z|00057|binding|INFO|Setting lport 2611e513-4316-4421-8b89-1c0f37157967 ovn-installed in OVS
Jan 23 05:25:59 np0005593295 ovn_controller[132789]: 2026-01-23T10:25:59Z|00058|binding|INFO|Setting lport 2611e513-4316-4421-8b89-1c0f37157967 up in Southbound
Jan 23 05:25:59 np0005593295 nova_compute[225701]: 2026-01-23 10:25:59.961 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:59 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.962 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[7f976c9b-4c43-4aeb-979d-4c4d562de40f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:59 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.992 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[ba86b8e1-0766-478c-bc3a-62962b434aa6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:59 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:25:59.997 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[7c691b1c-5287-451a-9aa8-e3e2eff4ab53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:59 np0005593295 NetworkManager[48964]: <info>  [1769163959.9988] manager: (tap712c0ef6-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.027 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[f6fcbb37-0223-41be-b3d8-3be3c015460a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.030 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[321a0823-dec4-474d-976d-23b2df9669cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:00 np0005593295 NetworkManager[48964]: <info>  [1769163960.0486] device (tap712c0ef6-f0): carrier: link connected
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.052 229840 DEBUG oslo.privsep.daemon [-] privsep: reply[c6c43b41-439f-41d2-bc23-a5011eed7c06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.069 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[1f271144-a9b9-4476-ac5e-12a5b9c5acc9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap712c0ef6-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d0:ec:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511269, 'reachable_time': 34347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236955, 'error': None, 'target': 'ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.083 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[565ec319-9745-4372-83a6-7ea0ca9846ea]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed0:ec06'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 511269, 'tstamp': 511269}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236956, 'error': None, 'target': 'ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.098 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[40f81d93-cc84-45ae-9b96-e2ac4552aff9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap712c0ef6-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d0:ec:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511269, 'reachable_time': 34347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236957, 'error': None, 'target': 'ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.127 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[622dd496-0dc3-49bc-9abb-f7fea9f9f3df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.180 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[ad97e76a-27f8-4608-a4d0-1ac5f76f37ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.182 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap712c0ef6-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.182 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.183 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap712c0ef6-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:00 np0005593295 NetworkManager[48964]: <info>  [1769163960.1852] manager: (tap712c0ef6-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Jan 23 05:26:00 np0005593295 kernel: tap712c0ef6-f0: entered promiscuous mode
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.185 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.188 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap712c0ef6-f0, col_values=(('external_ids', {'iface-id': '6c333384-cae4-4f40-8b56-257e8d961c46'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:00 np0005593295 ovn_controller[132789]: 2026-01-23T10:26:00Z|00059|binding|INFO|Releasing lport 6c333384-cae4-4f40-8b56-257e8d961c46 from this chassis (sb_readonly=0)
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.190 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.191 142606 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/712c0ef6-fbbe-4577-b44d-9610116b414a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/712c0ef6-fbbe-4577-b44d-9610116b414a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.192 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[a348a34b-c9f1-43b4-883d-131cf3efb915]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.193 142606 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: global
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]:    log         /dev/log local0 debug
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]:    log-tag     haproxy-metadata-proxy-712c0ef6-fbbe-4577-b44d-9610116b414a
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]:    user        root
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]:    group       root
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]:    maxconn     1024
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]:    pidfile     /var/lib/neutron/external/pids/712c0ef6-fbbe-4577-b44d-9610116b414a.pid.haproxy
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]:    daemon
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: 
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: defaults
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]:    log global
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]:    mode http
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]:    option httplog
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]:    option dontlognull
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]:    option http-server-close
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]:    option forwardfor
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]:    retries                 3
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]:    timeout http-request    30s
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]:    timeout connect         30s
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]:    timeout client          32s
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]:    timeout server          32s
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]:    timeout http-keep-alive 30s
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: 
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: 
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: listen listener
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]:    bind 169.254.169.254:80
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]:    http-request add-header X-OVN-Network-ID 712c0ef6-fbbe-4577-b44d-9610116b414a
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:26:00 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:26:00.194 142606 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a', 'env', 'PROCESS_TAG=haproxy-712c0ef6-fbbe-4577-b44d-9610116b414a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/712c0ef6-fbbe-4577-b44d-9610116b414a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.204 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.525 225706 DEBUG nova.virt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Emitting event <LifecycleEvent: 1769163960.5252264, 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.526 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] VM Started (Lifecycle Event)#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.544 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.547 225706 DEBUG nova.virt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Emitting event <LifecycleEvent: 1769163960.5264301, 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.547 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:26:00 np0005593295 podman[237034]: 2026-01-23 10:26:00.549276532 +0000 UTC m=+0.051905426 container create 46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.565 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.570 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.590 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:26:00 np0005593295 systemd[1]: Started libpod-conmon-46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3.scope.
Jan 23 05:26:00 np0005593295 podman[237034]: 2026-01-23 10:26:00.521598282 +0000 UTC m=+0.024227196 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 05:26:00 np0005593295 systemd[1]: Started libcrun container.
Jan 23 05:26:00 np0005593295 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/455af21eacddd3d5239de182b4e4b79fd4186593d5fa50aaea2fa48c1d2e0bce/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:26:00 np0005593295 podman[237034]: 2026-01-23 10:26:00.641279041 +0000 UTC m=+0.143907965 container init 46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 05:26:00 np0005593295 podman[237034]: 2026-01-23 10:26:00.647097833 +0000 UTC m=+0.149726717 container start 46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:26:00 np0005593295 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237049]: [NOTICE]   (237053) : New worker (237055) forked
Jan 23 05:26:00 np0005593295 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237049]: [NOTICE]   (237053) : Loading success.
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.671 225706 DEBUG nova.compute.manager [req-4509fc4f-a764-4505-9342-0e88d88fd085 req-c9013843-cfb4-4797-aaad-d1cbef9bc734 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.672 225706 DEBUG oslo_concurrency.lockutils [req-4509fc4f-a764-4505-9342-0e88d88fd085 req-c9013843-cfb4-4797-aaad-d1cbef9bc734 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.672 225706 DEBUG oslo_concurrency.lockutils [req-4509fc4f-a764-4505-9342-0e88d88fd085 req-c9013843-cfb4-4797-aaad-d1cbef9bc734 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.673 225706 DEBUG oslo_concurrency.lockutils [req-4509fc4f-a764-4505-9342-0e88d88fd085 req-c9013843-cfb4-4797-aaad-d1cbef9bc734 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.673 225706 DEBUG nova.compute.manager [req-4509fc4f-a764-4505-9342-0e88d88fd085 req-c9013843-cfb4-4797-aaad-d1cbef9bc734 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Processing event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.674 225706 DEBUG nova.compute.manager [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.678 225706 DEBUG nova.virt.driver [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] Emitting event <LifecycleEvent: 1769163960.6787412, 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.679 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.681 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.684 225706 INFO nova.virt.libvirt.driver [-] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Instance spawned successfully.#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.685 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.702 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.709 225706 DEBUG nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.712 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.713 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.714 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.714 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.715 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.715 225706 DEBUG nova.virt.libvirt.driver [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.741 225706 INFO nova.compute.manager [None req-a0c13c3c-5009-45ca-afed-fe3a66c71fb5 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.782 225706 INFO nova.compute.manager [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Took 6.33 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.783 225706 DEBUG nova.compute.manager [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.845 225706 INFO nova.compute.manager [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Took 7.26 seconds to build instance.#033[00m
Jan 23 05:26:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:00 np0005593295 nova_compute[225701]: 2026-01-23 10:26:00.865 225706 DEBUG oslo_concurrency.lockutils [None req-54aaa6b2-3080-41e4-8ddf-99a975fc0c89 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.343s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:01.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:26:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:01.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:26:01 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:02 np0005593295 nova_compute[225701]: 2026-01-23 10:26:02.727 225706 DEBUG nova.compute.manager [req-636eccae-31a8-4d72-8c60-5a750ae84797 req-d049e572-b532-49c6-b7c2-e511370e4243 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:02 np0005593295 nova_compute[225701]: 2026-01-23 10:26:02.728 225706 DEBUG oslo_concurrency.lockutils [req-636eccae-31a8-4d72-8c60-5a750ae84797 req-d049e572-b532-49c6-b7c2-e511370e4243 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:02 np0005593295 nova_compute[225701]: 2026-01-23 10:26:02.728 225706 DEBUG oslo_concurrency.lockutils [req-636eccae-31a8-4d72-8c60-5a750ae84797 req-d049e572-b532-49c6-b7c2-e511370e4243 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:02 np0005593295 nova_compute[225701]: 2026-01-23 10:26:02.729 225706 DEBUG oslo_concurrency.lockutils [req-636eccae-31a8-4d72-8c60-5a750ae84797 req-d049e572-b532-49c6-b7c2-e511370e4243 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:02 np0005593295 nova_compute[225701]: 2026-01-23 10:26:02.729 225706 DEBUG nova.compute.manager [req-636eccae-31a8-4d72-8c60-5a750ae84797 req-d049e572-b532-49c6-b7c2-e511370e4243 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] No waiting events found dispatching network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:26:02 np0005593295 nova_compute[225701]: 2026-01-23 10:26:02.729 225706 WARNING nova.compute.manager [req-636eccae-31a8-4d72-8c60-5a750ae84797 req-d049e572-b532-49c6-b7c2-e511370e4243 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received unexpected event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:26:02 np0005593295 nova_compute[225701]: 2026-01-23 10:26:02.779 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:02 np0005593295 nova_compute[225701]: 2026-01-23 10:26:02.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:02 np0005593295 nova_compute[225701]: 2026-01-23 10:26:02.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:26:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:03.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:03.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:03 np0005593295 nova_compute[225701]: 2026-01-23 10:26:03.992 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:04 np0005593295 nova_compute[225701]: 2026-01-23 10:26:04.057 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:05.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:05.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:05 np0005593295 nova_compute[225701]: 2026-01-23 10:26:05.797 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:05 np0005593295 nova_compute[225701]: 2026-01-23 10:26:05.798 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:26:05 np0005593295 nova_compute[225701]: 2026-01-23 10:26:05.798 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:26:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:06 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:06 np0005593295 nova_compute[225701]: 2026-01-23 10:26:06.801 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:26:06 np0005593295 nova_compute[225701]: 2026-01-23 10:26:06.801 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquired lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:26:06 np0005593295 nova_compute[225701]: 2026-01-23 10:26:06.801 225706 DEBUG nova.network.neutron [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:26:06 np0005593295 nova_compute[225701]: 2026-01-23 10:26:06.802 225706 DEBUG nova.objects.instance [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:26:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:07.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:07.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:07 np0005593295 nova_compute[225701]: 2026-01-23 10:26:07.740 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:07 np0005593295 NetworkManager[48964]: <info>  [1769163967.7443] manager: (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Jan 23 05:26:07 np0005593295 NetworkManager[48964]: <info>  [1769163967.7453] manager: (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Jan 23 05:26:07 np0005593295 ovn_controller[132789]: 2026-01-23T10:26:07Z|00060|binding|INFO|Releasing lport 6c333384-cae4-4f40-8b56-257e8d961c46 from this chassis (sb_readonly=0)
Jan 23 05:26:07 np0005593295 nova_compute[225701]: 2026-01-23 10:26:07.769 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:07 np0005593295 ovn_controller[132789]: 2026-01-23T10:26:07Z|00061|binding|INFO|Releasing lport 6c333384-cae4-4f40-8b56-257e8d961c46 from this chassis (sb_readonly=0)
Jan 23 05:26:07 np0005593295 nova_compute[225701]: 2026-01-23 10:26:07.774 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:08 np0005593295 nova_compute[225701]: 2026-01-23 10:26:08.996 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:09.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:09 np0005593295 nova_compute[225701]: 2026-01-23 10:26:09.051 225706 DEBUG nova.compute.manager [req-ae6960ed-8206-492c-8101-44d3a35d3066 req-4bb0b829-14d2-4eeb-90e9-3b2626ab5e4b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-changed-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:09 np0005593295 nova_compute[225701]: 2026-01-23 10:26:09.052 225706 DEBUG nova.compute.manager [req-ae6960ed-8206-492c-8101-44d3a35d3066 req-4bb0b829-14d2-4eeb-90e9-3b2626ab5e4b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Refreshing instance network info cache due to event network-changed-2611e513-4316-4421-8b89-1c0f37157967. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:26:09 np0005593295 nova_compute[225701]: 2026-01-23 10:26:09.052 225706 DEBUG oslo_concurrency.lockutils [req-ae6960ed-8206-492c-8101-44d3a35d3066 req-4bb0b829-14d2-4eeb-90e9-3b2626ab5e4b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:26:09 np0005593295 nova_compute[225701]: 2026-01-23 10:26:09.058 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:09.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:09 np0005593295 nova_compute[225701]: 2026-01-23 10:26:09.562 225706 DEBUG nova.network.neutron [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updating instance_info_cache with network_info: [{"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:26:09 np0005593295 nova_compute[225701]: 2026-01-23 10:26:09.591 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Releasing lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:26:09 np0005593295 nova_compute[225701]: 2026-01-23 10:26:09.591 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:26:09 np0005593295 nova_compute[225701]: 2026-01-23 10:26:09.592 225706 DEBUG oslo_concurrency.lockutils [req-ae6960ed-8206-492c-8101-44d3a35d3066 req-4bb0b829-14d2-4eeb-90e9-3b2626ab5e4b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:26:09 np0005593295 nova_compute[225701]: 2026-01-23 10:26:09.592 225706 DEBUG nova.network.neutron [req-ae6960ed-8206-492c-8101-44d3a35d3066 req-4bb0b829-14d2-4eeb-90e9-3b2626ab5e4b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Refreshing network info cache for port 2611e513-4316-4421-8b89-1c0f37157967 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:26:09 np0005593295 nova_compute[225701]: 2026-01-23 10:26:09.593 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:09 np0005593295 nova_compute[225701]: 2026-01-23 10:26:09.788 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:09 np0005593295 nova_compute[225701]: 2026-01-23 10:26:09.809 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:09 np0005593295 nova_compute[225701]: 2026-01-23 10:26:09.811 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:09 np0005593295 nova_compute[225701]: 2026-01-23 10:26:09.811 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:09 np0005593295 nova_compute[225701]: 2026-01-23 10:26:09.811 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:26:09 np0005593295 nova_compute[225701]: 2026-01-23 10:26:09.812 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:10 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:26:10 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1268700137' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:26:10 np0005593295 nova_compute[225701]: 2026-01-23 10:26:10.289 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:10 np0005593295 nova_compute[225701]: 2026-01-23 10:26:10.437 225706 DEBUG nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] skipping disk for instance-0000000b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:26:10 np0005593295 nova_compute[225701]: 2026-01-23 10:26:10.438 225706 DEBUG nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] skipping disk for instance-0000000b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:26:10 np0005593295 nova_compute[225701]: 2026-01-23 10:26:10.607 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:26:10 np0005593295 nova_compute[225701]: 2026-01-23 10:26:10.608 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4731MB free_disk=59.96738052368164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:26:10 np0005593295 nova_compute[225701]: 2026-01-23 10:26:10.609 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:10 np0005593295 nova_compute[225701]: 2026-01-23 10:26:10.609 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:10 np0005593295 nova_compute[225701]: 2026-01-23 10:26:10.725 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Instance 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:26:10 np0005593295 nova_compute[225701]: 2026-01-23 10:26:10.725 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:26:10 np0005593295 nova_compute[225701]: 2026-01-23 10:26:10.725 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:26:10 np0005593295 nova_compute[225701]: 2026-01-23 10:26:10.791 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:10 np0005593295 nova_compute[225701]: 2026-01-23 10:26:10.811 225706 DEBUG nova.network.neutron [req-ae6960ed-8206-492c-8101-44d3a35d3066 req-4bb0b829-14d2-4eeb-90e9-3b2626ab5e4b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updated VIF entry in instance network info cache for port 2611e513-4316-4421-8b89-1c0f37157967. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:26:10 np0005593295 nova_compute[225701]: 2026-01-23 10:26:10.812 225706 DEBUG nova.network.neutron [req-ae6960ed-8206-492c-8101-44d3a35d3066 req-4bb0b829-14d2-4eeb-90e9-3b2626ab5e4b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updating instance_info_cache with network_info: [{"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:26:10 np0005593295 nova_compute[225701]: 2026-01-23 10:26:10.826 225706 DEBUG oslo_concurrency.lockutils [req-ae6960ed-8206-492c-8101-44d3a35d3066 req-4bb0b829-14d2-4eeb-90e9-3b2626ab5e4b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:26:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:11.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:11 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:26:11 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/311252267' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:26:11 np0005593295 nova_compute[225701]: 2026-01-23 10:26:11.246 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:11 np0005593295 nova_compute[225701]: 2026-01-23 10:26:11.252 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:26:11 np0005593295 nova_compute[225701]: 2026-01-23 10:26:11.267 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:26:11 np0005593295 nova_compute[225701]: 2026-01-23 10:26:11.306 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:26:11 np0005593295 nova_compute[225701]: 2026-01-23 10:26:11.306 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.697s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:11 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:11.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:13.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:13.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:13 np0005593295 nova_compute[225701]: 2026-01-23 10:26:13.998 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:14 np0005593295 nova_compute[225701]: 2026-01-23 10:26:14.060 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:14 np0005593295 nova_compute[225701]: 2026-01-23 10:26:14.302 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:14 np0005593295 nova_compute[225701]: 2026-01-23 10:26:14.303 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:14 np0005593295 nova_compute[225701]: 2026-01-23 10:26:14.303 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:14 np0005593295 nova_compute[225701]: 2026-01-23 10:26:14.303 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:14 np0005593295 nova_compute[225701]: 2026-01-23 10:26:14.303 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:14 np0005593295 nova_compute[225701]: 2026-01-23 10:26:14.303 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:26:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:14 np0005593295 ovn_controller[132789]: 2026-01-23T10:26:14Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:58:a8:f2 10.100.0.13
Jan 23 05:26:14 np0005593295 ovn_controller[132789]: 2026-01-23T10:26:14Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:58:a8:f2 10.100.0.13
Jan 23 05:26:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:15.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:15.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:15 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:26:15 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:26:15 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:26:15 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:26:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:16 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:16 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:26:16 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:26:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:17.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:17.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:19 np0005593295 nova_compute[225701]: 2026-01-23 10:26:19.000 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:26:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:19.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:26:19 np0005593295 nova_compute[225701]: 2026-01-23 10:26:19.061 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:19.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:20 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:26:20 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:26:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:21.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:21 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:21.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:21 np0005593295 nova_compute[225701]: 2026-01-23 10:26:21.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:23.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:23.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:24 np0005593295 nova_compute[225701]: 2026-01-23 10:26:24.002 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:24 np0005593295 nova_compute[225701]: 2026-01-23 10:26:24.062 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:24 np0005593295 nova_compute[225701]: 2026-01-23 10:26:24.803 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:24 np0005593295 nova_compute[225701]: 2026-01-23 10:26:24.803 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:26:24 np0005593295 nova_compute[225701]: 2026-01-23 10:26:24.820 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:26:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:25.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:25.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:26 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:27.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:27.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:27 np0005593295 podman[237295]: 2026-01-23 10:26:27.634602064 +0000 UTC m=+0.054721955 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:26:27 np0005593295 podman[237294]: 2026-01-23 10:26:27.664635581 +0000 UTC m=+0.084562767 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 05:26:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:29 np0005593295 nova_compute[225701]: 2026-01-23 10:26:29.004 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:29 np0005593295 nova_compute[225701]: 2026-01-23 10:26:29.064 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:29.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:29.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:31.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:31 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:31.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:33.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:33.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:34 np0005593295 nova_compute[225701]: 2026-01-23 10:26:34.005 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:34 np0005593295 nova_compute[225701]: 2026-01-23 10:26:34.066 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:35.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:35.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:36 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:37.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:37.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:39 np0005593295 nova_compute[225701]: 2026-01-23 10:26:39.009 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:39 np0005593295 nova_compute[225701]: 2026-01-23 10:26:39.067 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:39.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:39.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:41.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:41 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:41.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:43.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:43.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:44 np0005593295 nova_compute[225701]: 2026-01-23 10:26:44.012 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:44 np0005593295 nova_compute[225701]: 2026-01-23 10:26:44.068 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:44 np0005593295 nova_compute[225701]: 2026-01-23 10:26:44.949 225706 INFO nova.compute.manager [None req-20215e91-4413-4a39-aa3e-9cf7fd1b6aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Get console output#033[00m
Jan 23 05:26:44 np0005593295 nova_compute[225701]: 2026-01-23 10:26:44.958 225706 INFO oslo.privsep.daemon [None req-20215e91-4413-4a39-aa3e-9cf7fd1b6aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpm1dfunpz/privsep.sock']#033[00m
Jan 23 05:26:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:45.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:45.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:45 np0005593295 nova_compute[225701]: 2026-01-23 10:26:45.721 225706 INFO oslo.privsep.daemon [None req-20215e91-4413-4a39-aa3e-9cf7fd1b6aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Jan 23 05:26:45 np0005593295 nova_compute[225701]: 2026-01-23 10:26:45.601 237361 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 23 05:26:45 np0005593295 nova_compute[225701]: 2026-01-23 10:26:45.608 237361 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 23 05:26:45 np0005593295 nova_compute[225701]: 2026-01-23 10:26:45.612 237361 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Jan 23 05:26:45 np0005593295 nova_compute[225701]: 2026-01-23 10:26:45.613 237361 INFO oslo.privsep.daemon [-] privsep daemon running as pid 237361#033[00m
Jan 23 05:26:45 np0005593295 nova_compute[225701]: 2026-01-23 10:26:45.817 237361 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:26:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:46 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:47 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:26:46.999 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:26:47 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:26:47.000 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:26:47 np0005593295 nova_compute[225701]: 2026-01-23 10:26:47.001 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:47.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:47 np0005593295 nova_compute[225701]: 2026-01-23 10:26:47.135 225706 DEBUG nova.compute.manager [req-c628b317-9e70-4683-a16f-849b7adbce6c req-6cf54849-65db-409d-bde2-2ef17ae8f864 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-changed-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:47 np0005593295 nova_compute[225701]: 2026-01-23 10:26:47.136 225706 DEBUG nova.compute.manager [req-c628b317-9e70-4683-a16f-849b7adbce6c req-6cf54849-65db-409d-bde2-2ef17ae8f864 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Refreshing instance network info cache due to event network-changed-2611e513-4316-4421-8b89-1c0f37157967. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:26:47 np0005593295 nova_compute[225701]: 2026-01-23 10:26:47.136 225706 DEBUG oslo_concurrency.lockutils [req-c628b317-9e70-4683-a16f-849b7adbce6c req-6cf54849-65db-409d-bde2-2ef17ae8f864 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:26:47 np0005593295 nova_compute[225701]: 2026-01-23 10:26:47.137 225706 DEBUG oslo_concurrency.lockutils [req-c628b317-9e70-4683-a16f-849b7adbce6c req-6cf54849-65db-409d-bde2-2ef17ae8f864 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:26:47 np0005593295 nova_compute[225701]: 2026-01-23 10:26:47.137 225706 DEBUG nova.network.neutron [req-c628b317-9e70-4683-a16f-849b7adbce6c req-6cf54849-65db-409d-bde2-2ef17ae8f864 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Refreshing network info cache for port 2611e513-4316-4421-8b89-1c0f37157967 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:26:47 np0005593295 nova_compute[225701]: 2026-01-23 10:26:47.172 225706 DEBUG nova.compute.manager [req-6ccae6f0-7623-4035-a9eb-51b29165d9a0 req-c44a50bc-40ef-415a-aa20-2daf76f3a4cb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-vif-unplugged-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:47 np0005593295 nova_compute[225701]: 2026-01-23 10:26:47.173 225706 DEBUG oslo_concurrency.lockutils [req-6ccae6f0-7623-4035-a9eb-51b29165d9a0 req-c44a50bc-40ef-415a-aa20-2daf76f3a4cb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:47 np0005593295 nova_compute[225701]: 2026-01-23 10:26:47.174 225706 DEBUG oslo_concurrency.lockutils [req-6ccae6f0-7623-4035-a9eb-51b29165d9a0 req-c44a50bc-40ef-415a-aa20-2daf76f3a4cb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:47 np0005593295 nova_compute[225701]: 2026-01-23 10:26:47.174 225706 DEBUG oslo_concurrency.lockutils [req-6ccae6f0-7623-4035-a9eb-51b29165d9a0 req-c44a50bc-40ef-415a-aa20-2daf76f3a4cb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:47 np0005593295 nova_compute[225701]: 2026-01-23 10:26:47.174 225706 DEBUG nova.compute.manager [req-6ccae6f0-7623-4035-a9eb-51b29165d9a0 req-c44a50bc-40ef-415a-aa20-2daf76f3a4cb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] No waiting events found dispatching network-vif-unplugged-2611e513-4316-4421-8b89-1c0f37157967 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:26:47 np0005593295 nova_compute[225701]: 2026-01-23 10:26:47.175 225706 WARNING nova.compute.manager [req-6ccae6f0-7623-4035-a9eb-51b29165d9a0 req-c44a50bc-40ef-415a-aa20-2daf76f3a4cb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received unexpected event network-vif-unplugged-2611e513-4316-4421-8b89-1c0f37157967 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:26:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:47.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:48 np0005593295 nova_compute[225701]: 2026-01-23 10:26:48.163 225706 INFO nova.compute.manager [None req-e98b0cfc-3a11-43be-ab3f-f7a210467de3 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Get console output#033[00m
Jan 23 05:26:48 np0005593295 nova_compute[225701]: 2026-01-23 10:26:48.170 237361 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:26:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:49 np0005593295 nova_compute[225701]: 2026-01-23 10:26:49.014 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:49 np0005593295 nova_compute[225701]: 2026-01-23 10:26:49.069 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:49.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:49.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:49 np0005593295 nova_compute[225701]: 2026-01-23 10:26:49.590 225706 DEBUG nova.compute.manager [req-ff21e5ca-3cb7-4ac8-98a1-ec15be70ee71 req-729e187b-e506-468f-bbc4-6ad9221eee13 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:49 np0005593295 nova_compute[225701]: 2026-01-23 10:26:49.591 225706 DEBUG oslo_concurrency.lockutils [req-ff21e5ca-3cb7-4ac8-98a1-ec15be70ee71 req-729e187b-e506-468f-bbc4-6ad9221eee13 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:49 np0005593295 nova_compute[225701]: 2026-01-23 10:26:49.591 225706 DEBUG oslo_concurrency.lockutils [req-ff21e5ca-3cb7-4ac8-98a1-ec15be70ee71 req-729e187b-e506-468f-bbc4-6ad9221eee13 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:49 np0005593295 nova_compute[225701]: 2026-01-23 10:26:49.591 225706 DEBUG oslo_concurrency.lockutils [req-ff21e5ca-3cb7-4ac8-98a1-ec15be70ee71 req-729e187b-e506-468f-bbc4-6ad9221eee13 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:49 np0005593295 nova_compute[225701]: 2026-01-23 10:26:49.592 225706 DEBUG nova.compute.manager [req-ff21e5ca-3cb7-4ac8-98a1-ec15be70ee71 req-729e187b-e506-468f-bbc4-6ad9221eee13 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] No waiting events found dispatching network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:26:49 np0005593295 nova_compute[225701]: 2026-01-23 10:26:49.592 225706 WARNING nova.compute.manager [req-ff21e5ca-3cb7-4ac8-98a1-ec15be70ee71 req-729e187b-e506-468f-bbc4-6ad9221eee13 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received unexpected event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:26:49 np0005593295 nova_compute[225701]: 2026-01-23 10:26:49.718 225706 DEBUG nova.network.neutron [req-c628b317-9e70-4683-a16f-849b7adbce6c req-6cf54849-65db-409d-bde2-2ef17ae8f864 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updated VIF entry in instance network info cache for port 2611e513-4316-4421-8b89-1c0f37157967. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:26:49 np0005593295 nova_compute[225701]: 2026-01-23 10:26:49.719 225706 DEBUG nova.network.neutron [req-c628b317-9e70-4683-a16f-849b7adbce6c req-6cf54849-65db-409d-bde2-2ef17ae8f864 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updating instance_info_cache with network_info: [{"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:26:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:49 np0005593295 nova_compute[225701]: 2026-01-23 10:26:49.958 225706 DEBUG oslo_concurrency.lockutils [req-c628b317-9e70-4683-a16f-849b7adbce6c req-6cf54849-65db-409d-bde2-2ef17ae8f864 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:26:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:51 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:26:51.002 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:51 np0005593295 nova_compute[225701]: 2026-01-23 10:26:51.090 225706 DEBUG nova.compute.manager [req-1e2a0c27-9ed7-4ce9-9771-49e476588684 req-33df6016-117e-47bf-a5bd-f84e7ec168c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-changed-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:51 np0005593295 nova_compute[225701]: 2026-01-23 10:26:51.091 225706 DEBUG nova.compute.manager [req-1e2a0c27-9ed7-4ce9-9771-49e476588684 req-33df6016-117e-47bf-a5bd-f84e7ec168c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Refreshing instance network info cache due to event network-changed-2611e513-4316-4421-8b89-1c0f37157967. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:26:51 np0005593295 nova_compute[225701]: 2026-01-23 10:26:51.091 225706 DEBUG oslo_concurrency.lockutils [req-1e2a0c27-9ed7-4ce9-9771-49e476588684 req-33df6016-117e-47bf-a5bd-f84e7ec168c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:26:51 np0005593295 nova_compute[225701]: 2026-01-23 10:26:51.091 225706 DEBUG oslo_concurrency.lockutils [req-1e2a0c27-9ed7-4ce9-9771-49e476588684 req-33df6016-117e-47bf-a5bd-f84e7ec168c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:26:51 np0005593295 nova_compute[225701]: 2026-01-23 10:26:51.091 225706 DEBUG nova.network.neutron [req-1e2a0c27-9ed7-4ce9-9771-49e476588684 req-33df6016-117e-47bf-a5bd-f84e7ec168c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Refreshing network info cache for port 2611e513-4316-4421-8b89-1c0f37157967 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:26:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:51.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:51 np0005593295 nova_compute[225701]: 2026-01-23 10:26:51.101 225706 INFO nova.compute.manager [None req-588c94de-092b-4d0f-8588-d7670846f3f7 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Get console output#033[00m
Jan 23 05:26:51 np0005593295 nova_compute[225701]: 2026-01-23 10:26:51.105 237361 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:26:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:51 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:51.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:51 np0005593295 nova_compute[225701]: 2026-01-23 10:26:51.699 225706 DEBUG nova.compute.manager [req-2bb8696d-7016-4e61-96a9-a82bc05e8e60 req-f29f1513-d6db-4258-b0a1-5b08975efa17 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:51 np0005593295 nova_compute[225701]: 2026-01-23 10:26:51.700 225706 DEBUG oslo_concurrency.lockutils [req-2bb8696d-7016-4e61-96a9-a82bc05e8e60 req-f29f1513-d6db-4258-b0a1-5b08975efa17 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:51 np0005593295 nova_compute[225701]: 2026-01-23 10:26:51.700 225706 DEBUG oslo_concurrency.lockutils [req-2bb8696d-7016-4e61-96a9-a82bc05e8e60 req-f29f1513-d6db-4258-b0a1-5b08975efa17 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:51 np0005593295 nova_compute[225701]: 2026-01-23 10:26:51.701 225706 DEBUG oslo_concurrency.lockutils [req-2bb8696d-7016-4e61-96a9-a82bc05e8e60 req-f29f1513-d6db-4258-b0a1-5b08975efa17 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:51 np0005593295 nova_compute[225701]: 2026-01-23 10:26:51.701 225706 DEBUG nova.compute.manager [req-2bb8696d-7016-4e61-96a9-a82bc05e8e60 req-f29f1513-d6db-4258-b0a1-5b08975efa17 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] No waiting events found dispatching network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:26:51 np0005593295 nova_compute[225701]: 2026-01-23 10:26:51.702 225706 WARNING nova.compute.manager [req-2bb8696d-7016-4e61-96a9-a82bc05e8e60 req-f29f1513-d6db-4258-b0a1-5b08975efa17 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received unexpected event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:26:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:26:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:53.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:26:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:53 np0005593295 nova_compute[225701]: 2026-01-23 10:26:53.319 225706 DEBUG nova.network.neutron [req-1e2a0c27-9ed7-4ce9-9771-49e476588684 req-33df6016-117e-47bf-a5bd-f84e7ec168c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updated VIF entry in instance network info cache for port 2611e513-4316-4421-8b89-1c0f37157967. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:26:53 np0005593295 nova_compute[225701]: 2026-01-23 10:26:53.319 225706 DEBUG nova.network.neutron [req-1e2a0c27-9ed7-4ce9-9771-49e476588684 req-33df6016-117e-47bf-a5bd-f84e7ec168c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updating instance_info_cache with network_info: [{"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:26:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:53.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:53 np0005593295 nova_compute[225701]: 2026-01-23 10:26:53.384 225706 DEBUG oslo_concurrency.lockutils [req-1e2a0c27-9ed7-4ce9-9771-49e476588684 req-33df6016-117e-47bf-a5bd-f84e7ec168c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:26:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:53 np0005593295 nova_compute[225701]: 2026-01-23 10:26:53.988 225706 DEBUG nova.compute.manager [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:53 np0005593295 nova_compute[225701]: 2026-01-23 10:26:53.988 225706 DEBUG oslo_concurrency.lockutils [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:53 np0005593295 nova_compute[225701]: 2026-01-23 10:26:53.990 225706 DEBUG oslo_concurrency.lockutils [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:53 np0005593295 nova_compute[225701]: 2026-01-23 10:26:53.990 225706 DEBUG oslo_concurrency.lockutils [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:53 np0005593295 nova_compute[225701]: 2026-01-23 10:26:53.991 225706 DEBUG nova.compute.manager [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] No waiting events found dispatching network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:26:53 np0005593295 nova_compute[225701]: 2026-01-23 10:26:53.991 225706 WARNING nova.compute.manager [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received unexpected event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:26:54 np0005593295 nova_compute[225701]: 2026-01-23 10:26:54.016 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:54 np0005593295 nova_compute[225701]: 2026-01-23 10:26:54.070 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:55.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:55.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:26:55.494 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:26:55.495 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:26:55.496 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:56 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:57.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:26:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:57.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:26:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:58 np0005593295 podman[237403]: 2026-01-23 10:26:58.632667957 +0000 UTC m=+0.052243994 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 05:26:58 np0005593295 podman[237402]: 2026-01-23 10:26:58.68779159 +0000 UTC m=+0.101725898 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 05:26:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:59 np0005593295 nova_compute[225701]: 2026-01-23 10:26:59.017 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:59 np0005593295 nova_compute[225701]: 2026-01-23 10:26:59.072 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:59.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:26:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:26:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:26:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:26:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:59.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:26:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:26:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:01.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:27:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:01.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:27:01 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:03.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:27:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:03.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:27:03 np0005593295 nova_compute[225701]: 2026-01-23 10:27:03.796 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:04 np0005593295 nova_compute[225701]: 2026-01-23 10:27:04.019 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:04 np0005593295 nova_compute[225701]: 2026-01-23 10:27:04.072 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:27:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:05.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:27:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:05.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:05 np0005593295 nova_compute[225701]: 2026-01-23 10:27:05.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:05 np0005593295 nova_compute[225701]: 2026-01-23 10:27:05.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:27:05 np0005593295 nova_compute[225701]: 2026-01-23 10:27:05.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:27:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:06 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:06 np0005593295 nova_compute[225701]: 2026-01-23 10:27:06.793 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:27:06 np0005593295 nova_compute[225701]: 2026-01-23 10:27:06.793 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquired lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:27:06 np0005593295 nova_compute[225701]: 2026-01-23 10:27:06.794 225706 DEBUG nova.network.neutron [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:27:06 np0005593295 nova_compute[225701]: 2026-01-23 10:27:06.794 225706 DEBUG nova.objects.instance [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:27:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:07.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:27:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:07.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:27:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:09 np0005593295 nova_compute[225701]: 2026-01-23 10:27:09.036 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:09 np0005593295 nova_compute[225701]: 2026-01-23 10:27:09.074 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.002000048s ======
Jan 23 05:27:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:09.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Jan 23 05:27:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:27:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:09.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:27:09 np0005593295 nova_compute[225701]: 2026-01-23 10:27:09.593 225706 DEBUG nova.compute.manager [req-3f3650be-0f26-4683-9b85-8b0b85d51374 req-aabc123c-b5dd-4d4b-94cc-5989201ba0f8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-changed-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:27:09 np0005593295 nova_compute[225701]: 2026-01-23 10:27:09.593 225706 DEBUG nova.compute.manager [req-3f3650be-0f26-4683-9b85-8b0b85d51374 req-aabc123c-b5dd-4d4b-94cc-5989201ba0f8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Refreshing instance network info cache due to event network-changed-2611e513-4316-4421-8b89-1c0f37157967. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:27:09 np0005593295 nova_compute[225701]: 2026-01-23 10:27:09.593 225706 DEBUG oslo_concurrency.lockutils [req-3f3650be-0f26-4683-9b85-8b0b85d51374 req-aabc123c-b5dd-4d4b-94cc-5989201ba0f8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:27:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:10 np0005593295 nova_compute[225701]: 2026-01-23 10:27:10.206 225706 DEBUG oslo_concurrency.lockutils [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:10 np0005593295 nova_compute[225701]: 2026-01-23 10:27:10.206 225706 DEBUG oslo_concurrency.lockutils [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:10 np0005593295 nova_compute[225701]: 2026-01-23 10:27:10.207 225706 DEBUG oslo_concurrency.lockutils [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:10 np0005593295 nova_compute[225701]: 2026-01-23 10:27:10.207 225706 DEBUG oslo_concurrency.lockutils [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:10 np0005593295 nova_compute[225701]: 2026-01-23 10:27:10.207 225706 DEBUG oslo_concurrency.lockutils [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:10 np0005593295 nova_compute[225701]: 2026-01-23 10:27:10.208 225706 INFO nova.compute.manager [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Terminating instance#033[00m
Jan 23 05:27:10 np0005593295 nova_compute[225701]: 2026-01-23 10:27:10.209 225706 DEBUG nova.compute.manager [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:27:10 np0005593295 nova_compute[225701]: 2026-01-23 10:27:10.212 225706 DEBUG nova.network.neutron [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updating instance_info_cache with network_info: [{"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:27:10 np0005593295 nova_compute[225701]: 2026-01-23 10:27:10.236 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Releasing lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:27:10 np0005593295 nova_compute[225701]: 2026-01-23 10:27:10.236 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:27:10 np0005593295 nova_compute[225701]: 2026-01-23 10:27:10.236 225706 DEBUG oslo_concurrency.lockutils [req-3f3650be-0f26-4683-9b85-8b0b85d51374 req-aabc123c-b5dd-4d4b-94cc-5989201ba0f8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:27:10 np0005593295 nova_compute[225701]: 2026-01-23 10:27:10.237 225706 DEBUG nova.network.neutron [req-3f3650be-0f26-4683-9b85-8b0b85d51374 req-aabc123c-b5dd-4d4b-94cc-5989201ba0f8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Refreshing network info cache for port 2611e513-4316-4421-8b89-1c0f37157967 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:27:10 np0005593295 nova_compute[225701]: 2026-01-23 10:27:10.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:10 np0005593295 nova_compute[225701]: 2026-01-23 10:27:10.807 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:11.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:11.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.509 225706 DEBUG nova.network.neutron [req-3f3650be-0f26-4683-9b85-8b0b85d51374 req-aabc123c-b5dd-4d4b-94cc-5989201ba0f8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updated VIF entry in instance network info cache for port 2611e513-4316-4421-8b89-1c0f37157967. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.509 225706 DEBUG nova.network.neutron [req-3f3650be-0f26-4683-9b85-8b0b85d51374 req-aabc123c-b5dd-4d4b-94cc-5989201ba0f8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updating instance_info_cache with network_info: [{"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.536 225706 DEBUG oslo_concurrency.lockutils [req-3f3650be-0f26-4683-9b85-8b0b85d51374 req-aabc123c-b5dd-4d4b-94cc-5989201ba0f8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:27:11 np0005593295 kernel: tap2611e513-43 (unregistering): left promiscuous mode
Jan 23 05:27:11 np0005593295 NetworkManager[48964]: <info>  [1769164031.6608] device (tap2611e513-43): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:27:11 np0005593295 ovn_controller[132789]: 2026-01-23T10:27:11Z|00062|binding|INFO|Releasing lport 2611e513-4316-4421-8b89-1c0f37157967 from this chassis (sb_readonly=0)
Jan 23 05:27:11 np0005593295 ovn_controller[132789]: 2026-01-23T10:27:11Z|00063|binding|INFO|Setting lport 2611e513-4316-4421-8b89-1c0f37157967 down in Southbound
Jan 23 05:27:11 np0005593295 ovn_controller[132789]: 2026-01-23T10:27:11Z|00064|binding|INFO|Removing iface tap2611e513-43 ovn-installed in OVS
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.670 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.673 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:11 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:27:11.681 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:a8:f2 10.100.0.13'], port_security=['fa:16:3e:58:a8:f2 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-712c0ef6-fbbe-4577-b44d-9610116b414a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '8', 'neutron:security_group_ids': '1d01fb50-5068-4dfb-b608-e6e67ad89b2a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3547f5ca-ca7c-4ba0-a5f8-3ad2055eb8ec, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>], logical_port=2611e513-4316-4421-8b89-1c0f37157967) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fefcfaf1640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:27:11 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:27:11.683 142606 INFO neutron.agent.ovn.metadata.agent [-] Port 2611e513-4316-4421-8b89-1c0f37157967 in datapath 712c0ef6-fbbe-4577-b44d-9610116b414a unbound from our chassis#033[00m
Jan 23 05:27:11 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:27:11.685 142606 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 712c0ef6-fbbe-4577-b44d-9610116b414a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:27:11 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:27:11.689 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[6e03dcd2-604c-4b12-91da-b67937982d00]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.690 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:11 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:27:11.692 142606 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a namespace which is not needed anymore#033[00m
Jan 23 05:27:11 np0005593295 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Jan 23 05:27:11 np0005593295 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000b.scope: Consumed 15.850s CPU time.
Jan 23 05:27:11 np0005593295 systemd-machined[194368]: Machine qemu-4-instance-0000000b terminated.
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.823 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.824 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.824 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.825 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.825 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.846 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.852 225706 INFO nova.virt.libvirt.driver [-] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Instance destroyed successfully.#033[00m
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.852 225706 DEBUG nova.objects.instance [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'resources' on Instance uuid 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:27:11 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.869 225706 DEBUG nova.virt.libvirt.vif [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:25:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1129329512',display_name='tempest-TestNetworkBasicOps-server-1129329512',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1129329512',id=11,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBOLllCuGpYDHB8HQl4gVCADogEY6z7uz5xBJbTjU7iL3TTWWE5uwU0nWT40qz7D0IhyDFXlwX4fWDCogYSyOPhCdGvOGsxFut3XTWNKcRsbqCULLjO4VMFh09pWX8E0IA==',key_name='tempest-TestNetworkBasicOps-1378329290',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:26:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-6r33a8b6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:26:00Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.870 225706 DEBUG nova.network.os_vif_util [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "2611e513-4316-4421-8b89-1c0f37157967", "address": "fa:16:3e:58:a8:f2", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2611e513-43", "ovs_interfaceid": "2611e513-4316-4421-8b89-1c0f37157967", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.871 225706 DEBUG nova.network.os_vif_util [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:58:a8:f2,bridge_name='br-int',has_traffic_filtering=True,id=2611e513-4316-4421-8b89-1c0f37157967,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2611e513-43') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.871 225706 DEBUG os_vif [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:58:a8:f2,bridge_name='br-int',has_traffic_filtering=True,id=2611e513-4316-4421-8b89-1c0f37157967,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2611e513-43') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.873 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.873 225706 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2611e513-43, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.875 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.876 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:11 np0005593295 nova_compute[225701]: 2026-01-23 10:27:11.880 225706 INFO os_vif [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:58:a8:f2,bridge_name='br-int',has_traffic_filtering=True,id=2611e513-4316-4421-8b89-1c0f37157967,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2611e513-43')#033[00m
Jan 23 05:27:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:12 np0005593295 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237049]: [NOTICE]   (237053) : haproxy version is 2.8.14-c23fe91
Jan 23 05:27:12 np0005593295 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237049]: [NOTICE]   (237053) : path to executable is /usr/sbin/haproxy
Jan 23 05:27:12 np0005593295 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237049]: [WARNING]  (237053) : Exiting Master process...
Jan 23 05:27:12 np0005593295 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237049]: [ALERT]    (237053) : Current worker (237055) exited with code 143 (Terminated)
Jan 23 05:27:12 np0005593295 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237049]: [WARNING]  (237053) : All workers exited. Exiting... (0)
Jan 23 05:27:12 np0005593295 systemd[1]: libpod-46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3.scope: Deactivated successfully.
Jan 23 05:27:12 np0005593295 podman[237511]: 2026-01-23 10:27:12.627418671 +0000 UTC m=+0.818555966 container died 46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:27:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:13 np0005593295 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3-userdata-shm.mount: Deactivated successfully.
Jan 23 05:27:13 np0005593295 systemd[1]: var-lib-containers-storage-overlay-455af21eacddd3d5239de182b4e4b79fd4186593d5fa50aaea2fa48c1d2e0bce-merged.mount: Deactivated successfully.
Jan 23 05:27:13 np0005593295 podman[237511]: 2026-01-23 10:27:13.076589229 +0000 UTC m=+1.267726504 container cleanup 46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:27:13 np0005593295 nova_compute[225701]: 2026-01-23 10:27:13.082 225706 DEBUG nova.compute.manager [req-cc441573-7ed5-4ded-ae98-bdb93e89d734 req-01e39a0b-7b06-4d5d-830a-594188973c24 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-vif-unplugged-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:27:13 np0005593295 nova_compute[225701]: 2026-01-23 10:27:13.083 225706 DEBUG oslo_concurrency.lockutils [req-cc441573-7ed5-4ded-ae98-bdb93e89d734 req-01e39a0b-7b06-4d5d-830a-594188973c24 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:13 np0005593295 nova_compute[225701]: 2026-01-23 10:27:13.083 225706 DEBUG oslo_concurrency.lockutils [req-cc441573-7ed5-4ded-ae98-bdb93e89d734 req-01e39a0b-7b06-4d5d-830a-594188973c24 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:13 np0005593295 nova_compute[225701]: 2026-01-23 10:27:13.083 225706 DEBUG oslo_concurrency.lockutils [req-cc441573-7ed5-4ded-ae98-bdb93e89d734 req-01e39a0b-7b06-4d5d-830a-594188973c24 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:13 np0005593295 nova_compute[225701]: 2026-01-23 10:27:13.084 225706 DEBUG nova.compute.manager [req-cc441573-7ed5-4ded-ae98-bdb93e89d734 req-01e39a0b-7b06-4d5d-830a-594188973c24 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] No waiting events found dispatching network-vif-unplugged-2611e513-4316-4421-8b89-1c0f37157967 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:27:13 np0005593295 nova_compute[225701]: 2026-01-23 10:27:13.084 225706 DEBUG nova.compute.manager [req-cc441573-7ed5-4ded-ae98-bdb93e89d734 req-01e39a0b-7b06-4d5d-830a-594188973c24 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-vif-unplugged-2611e513-4316-4421-8b89-1c0f37157967 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:27:13 np0005593295 systemd[1]: libpod-conmon-46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3.scope: Deactivated successfully.
Jan 23 05:27:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:13.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:13 np0005593295 podman[237588]: 2026-01-23 10:27:13.33566186 +0000 UTC m=+0.231422873 container remove 46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:27:13 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:27:13.343 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[5225c8b6-c06d-4eb4-b647-6d819df259a2]: (4, ('Fri Jan 23 10:27:11 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a (46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3)\n46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3\nFri Jan 23 10:27:13 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a (46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3)\n46cd76023cd6a75b288e0b3c91b63a4e6e3dd7c9b68db5c46ab0476eedea81e3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:13 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:27:13.346 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[7a517479-a4c5-436a-b1e5-33258409200d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:13 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:27:13.348 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap712c0ef6-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:13 np0005593295 kernel: tap712c0ef6-f0: left promiscuous mode
Jan 23 05:27:13 np0005593295 nova_compute[225701]: 2026-01-23 10:27:13.381 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:13 np0005593295 nova_compute[225701]: 2026-01-23 10:27:13.394 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:13 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:27:13.398 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[35561923-747d-4bda-ba5f-1a66b452588c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:13.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:13 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:27:13 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/506360951' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:27:13 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:27:13.415 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[e7af4a28-f868-4347-964c-fe1866adeaec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:13 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:27:13.417 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[7048755b-7537-4c47-aebd-6f557c2b2b7e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:13 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:27:13.434 229823 DEBUG oslo.privsep.daemon [-] privsep: reply[a717d8d7-ce6a-421c-9993-bd79d450c71a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511263, 'reachable_time': 42014, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237610, 'error': None, 'target': 'ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:13 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:27:13.439 142723 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:27:13 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:27:13.439 142723 DEBUG oslo.privsep.daemon [-] privsep: reply[c992ed24-172b-42b7-9dd7-4d0f468839a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:13 np0005593295 systemd[1]: run-netns-ovnmeta\x2d712c0ef6\x2dfbbe\x2d4577\x2db44d\x2d9610116b414a.mount: Deactivated successfully.
Jan 23 05:27:13 np0005593295 nova_compute[225701]: 2026-01-23 10:27:13.442 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.617s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:13 np0005593295 nova_compute[225701]: 2026-01-23 10:27:13.500 225706 DEBUG nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] skipping disk for instance-0000000b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:27:13 np0005593295 nova_compute[225701]: 2026-01-23 10:27:13.500 225706 DEBUG nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] skipping disk for instance-0000000b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:27:13 np0005593295 nova_compute[225701]: 2026-01-23 10:27:13.640 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:27:13 np0005593295 nova_compute[225701]: 2026-01-23 10:27:13.642 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4872MB free_disk=59.94270324707031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:27:13 np0005593295 nova_compute[225701]: 2026-01-23 10:27:13.642 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:13 np0005593295 nova_compute[225701]: 2026-01-23 10:27:13.642 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:13 np0005593295 nova_compute[225701]: 2026-01-23 10:27:13.721 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Instance 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:27:13 np0005593295 nova_compute[225701]: 2026-01-23 10:27:13.721 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:27:13 np0005593295 nova_compute[225701]: 2026-01-23 10:27:13.721 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:27:13 np0005593295 nova_compute[225701]: 2026-01-23 10:27:13.760 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:14 np0005593295 nova_compute[225701]: 2026-01-23 10:27:14.022 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:14 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:27:14 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3948964576' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:27:14 np0005593295 nova_compute[225701]: 2026-01-23 10:27:14.294 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.534s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:14 np0005593295 nova_compute[225701]: 2026-01-23 10:27:14.300 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:27:14 np0005593295 nova_compute[225701]: 2026-01-23 10:27:14.318 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:27:14 np0005593295 nova_compute[225701]: 2026-01-23 10:27:14.345 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:27:14 np0005593295 nova_compute[225701]: 2026-01-23 10:27:14.346 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.704s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:14 np0005593295 nova_compute[225701]: 2026-01-23 10:27:14.350 225706 DEBUG nova.compute.manager [req-a0e421ed-fa7e-4e8a-a2e8-d1621a9a938a req-22b5bd33-f618-43b5-ba80-35856d1fdacd 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:27:14 np0005593295 nova_compute[225701]: 2026-01-23 10:27:14.350 225706 DEBUG oslo_concurrency.lockutils [req-a0e421ed-fa7e-4e8a-a2e8-d1621a9a938a req-22b5bd33-f618-43b5-ba80-35856d1fdacd 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:14 np0005593295 nova_compute[225701]: 2026-01-23 10:27:14.351 225706 DEBUG oslo_concurrency.lockutils [req-a0e421ed-fa7e-4e8a-a2e8-d1621a9a938a req-22b5bd33-f618-43b5-ba80-35856d1fdacd 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:14 np0005593295 nova_compute[225701]: 2026-01-23 10:27:14.351 225706 DEBUG oslo_concurrency.lockutils [req-a0e421ed-fa7e-4e8a-a2e8-d1621a9a938a req-22b5bd33-f618-43b5-ba80-35856d1fdacd 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:14 np0005593295 nova_compute[225701]: 2026-01-23 10:27:14.351 225706 DEBUG nova.compute.manager [req-a0e421ed-fa7e-4e8a-a2e8-d1621a9a938a req-22b5bd33-f618-43b5-ba80-35856d1fdacd 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] No waiting events found dispatching network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:27:14 np0005593295 nova_compute[225701]: 2026-01-23 10:27:14.351 225706 WARNING nova.compute.manager [req-a0e421ed-fa7e-4e8a-a2e8-d1621a9a938a req-22b5bd33-f618-43b5-ba80-35856d1fdacd 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received unexpected event network-vif-plugged-2611e513-4316-4421-8b89-1c0f37157967 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:27:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:15.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:15.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:16 np0005593295 nova_compute[225701]: 2026-01-23 10:27:16.347 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:16 np0005593295 nova_compute[225701]: 2026-01-23 10:27:16.347 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:16 np0005593295 nova_compute[225701]: 2026-01-23 10:27:16.347 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:16 np0005593295 nova_compute[225701]: 2026-01-23 10:27:16.347 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:16 np0005593295 nova_compute[225701]: 2026-01-23 10:27:16.348 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:16 np0005593295 nova_compute[225701]: 2026-01-23 10:27:16.348 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:27:16 np0005593295 nova_compute[225701]: 2026-01-23 10:27:16.877 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:17.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:27:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:17.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:27:17 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:17 np0005593295 nova_compute[225701]: 2026-01-23 10:27:17.827 225706 INFO nova.virt.libvirt.driver [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Deleting instance files /var/lib/nova/instances/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_del#033[00m
Jan 23 05:27:17 np0005593295 nova_compute[225701]: 2026-01-23 10:27:17.828 225706 INFO nova.virt.libvirt.driver [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Deletion of /var/lib/nova/instances/65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad_del complete#033[00m
Jan 23 05:27:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:17 np0005593295 nova_compute[225701]: 2026-01-23 10:27:17.895 225706 INFO nova.compute.manager [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Took 7.69 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:27:17 np0005593295 nova_compute[225701]: 2026-01-23 10:27:17.895 225706 DEBUG oslo.service.loopingcall [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:27:17 np0005593295 nova_compute[225701]: 2026-01-23 10:27:17.896 225706 DEBUG nova.compute.manager [-] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:27:17 np0005593295 nova_compute[225701]: 2026-01-23 10:27:17.896 225706 DEBUG nova.network.neutron [-] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:27:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:19 np0005593295 nova_compute[225701]: 2026-01-23 10:27:19.024 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:19.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:27:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:19.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:27:19 np0005593295 nova_compute[225701]: 2026-01-23 10:27:19.530 225706 DEBUG nova.network.neutron [-] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:27:19 np0005593295 nova_compute[225701]: 2026-01-23 10:27:19.561 225706 INFO nova.compute.manager [-] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Took 1.67 seconds to deallocate network for instance.#033[00m
Jan 23 05:27:19 np0005593295 nova_compute[225701]: 2026-01-23 10:27:19.627 225706 DEBUG oslo_concurrency.lockutils [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:19 np0005593295 nova_compute[225701]: 2026-01-23 10:27:19.628 225706 DEBUG oslo_concurrency.lockutils [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:19 np0005593295 nova_compute[225701]: 2026-01-23 10:27:19.674 225706 DEBUG nova.compute.manager [req-90d7f807-96f1-48fc-9d1c-5762c0e654f5 req-52dcc5ab-4ace-496c-b3f8-4a158b7bad2c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Received event network-vif-deleted-2611e513-4316-4421-8b89-1c0f37157967 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:27:19 np0005593295 nova_compute[225701]: 2026-01-23 10:27:19.680 225706 DEBUG oslo_concurrency.processutils [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:20 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:27:20 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2669587480' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:27:20 np0005593295 nova_compute[225701]: 2026-01-23 10:27:20.157 225706 DEBUG oslo_concurrency.processutils [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:20 np0005593295 nova_compute[225701]: 2026-01-23 10:27:20.163 225706 DEBUG nova.compute.provider_tree [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:27:20 np0005593295 nova_compute[225701]: 2026-01-23 10:27:20.180 225706 DEBUG nova.scheduler.client.report [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:27:20 np0005593295 nova_compute[225701]: 2026-01-23 10:27:20.319 225706 DEBUG oslo_concurrency.lockutils [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.690s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:20 np0005593295 nova_compute[225701]: 2026-01-23 10:27:20.341 225706 INFO nova.scheduler.client.report [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Deleted allocations for instance 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad#033[00m
Jan 23 05:27:20 np0005593295 nova_compute[225701]: 2026-01-23 10:27:20.457 225706 DEBUG oslo_concurrency.lockutils [None req-9b622b50-8b02-47ce-9ee0-86dc328d6067 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:27:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:21.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:27:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:21.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:21 np0005593295 nova_compute[225701]: 2026-01-23 10:27:21.879 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:22 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:23.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:27:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:23.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:27:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:24 np0005593295 nova_compute[225701]: 2026-01-23 10:27:24.026 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:25.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:27:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:25.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:27:25 np0005593295 nova_compute[225701]: 2026-01-23 10:27:25.681 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:25 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:27:25 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:27:25 np0005593295 nova_compute[225701]: 2026-01-23 10:27:25.831 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:26 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:27:26 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:27:26 np0005593295 ceph-mon[75771]: Health check update: 2 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Jan 23 05:27:26 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:27:26 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:27:26 np0005593295 nova_compute[225701]: 2026-01-23 10:27:26.850 225706 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164031.845321, 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:27:26 np0005593295 nova_compute[225701]: 2026-01-23 10:27:26.851 225706 INFO nova.compute.manager [-] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:27:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:26 np0005593295 nova_compute[225701]: 2026-01-23 10:27:26.882 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:26 np0005593295 nova_compute[225701]: 2026-01-23 10:27:26.887 225706 DEBUG nova.compute.manager [None req-1cefaecd-2424-412a-86d8-a1d9fb0d2b20 - - - - - -] [instance: 65ba3e6d-0ed7-4f3a-ad9e-e79d166a75ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:27:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:27.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:27.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:27 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:29 np0005593295 nova_compute[225701]: 2026-01-23 10:27:29.028 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:29.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:29.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:29 np0005593295 podman[237783]: 2026-01-23 10:27:29.645604919 +0000 UTC m=+0.060087268 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:27:29 np0005593295 podman[237782]: 2026-01-23 10:27:29.679825366 +0000 UTC m=+0.090173463 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller)
Jan 23 05:27:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:27:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:31.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:27:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:27:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:31.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:27:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:31 np0005593295 nova_compute[225701]: 2026-01-23 10:27:31.886 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:32 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:27:32 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:27:32 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:33.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:27:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:33.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:27:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:34 np0005593295 nova_compute[225701]: 2026-01-23 10:27:34.029 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:27:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:35.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:27:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:35.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:36 np0005593295 nova_compute[225701]: 2026-01-23 10:27:36.889 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:37.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:27:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:37.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:27:37 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:39 np0005593295 nova_compute[225701]: 2026-01-23 10:27:39.030 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:39.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:39.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:41.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:27:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:41.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:27:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:41 np0005593295 nova_compute[225701]: 2026-01-23 10:27:41.892 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:42 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:43.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:27:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:43.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:27:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:44 np0005593295 nova_compute[225701]: 2026-01-23 10:27:44.032 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:45.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:45.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:46 np0005593295 nova_compute[225701]: 2026-01-23 10:27:46.895 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:27:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:47.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:27:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:47.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:47 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:27:47.571 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:27:47 np0005593295 nova_compute[225701]: 2026-01-23 10:27:47.572 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:47 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:27:47.573 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:27:47 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:49 np0005593295 nova_compute[225701]: 2026-01-23 10:27:49.076 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:49.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:27:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:49.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:27:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:27:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:51.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:27:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:51.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:51 np0005593295 nova_compute[225701]: 2026-01-23 10:27:51.898 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:52 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:27:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:53.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:27:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.002000048s ======
Jan 23 05:27:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:53.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Jan 23 05:27:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:54 np0005593295 nova_compute[225701]: 2026-01-23 10:27:54.079 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:55.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:27:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:55.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:27:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:27:55.495 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:27:55.496 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:27:55.496 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:56 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:27:56.575 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:56 np0005593295 nova_compute[225701]: 2026-01-23 10:27:56.900 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:57.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:57.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:58 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:59 np0005593295 nova_compute[225701]: 2026-01-23 10:27:59.122 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:27:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:59.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:27:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:59.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:27:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:27:59 np0005593295 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 23 05:27:59 np0005593295 podman[237908]: 2026-01-23 10:27:59.956657268 +0000 UTC m=+0.046640585 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 05:27:59 np0005593295 podman[237907]: 2026-01-23 10:27:59.981710838 +0000 UTC m=+0.077790335 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 05:28:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:01.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:01.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:01 np0005593295 nova_compute[225701]: 2026-01-23 10:28:01.903 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:28:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:03.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:28:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:03.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:03 np0005593295 ceph-mds[83039]: mds.beacon.cephfs.compute-2.prgzmm missed beacon ack from the monitors
Jan 23 05:28:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:04 np0005593295 nova_compute[225701]: 2026-01-23 10:28:04.124 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:04 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:04 np0005593295 nova_compute[225701]: 2026-01-23 10:28:04.779 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:05.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:05.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:06 np0005593295 nova_compute[225701]: 2026-01-23 10:28:06.906 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:07.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:07 np0005593295 ovn_controller[132789]: 2026-01-23T10:28:07Z|00065|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 23 05:28:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:07.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:07 np0005593295 nova_compute[225701]: 2026-01-23 10:28:07.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:07 np0005593295 nova_compute[225701]: 2026-01-23 10:28:07.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:28:07 np0005593295 nova_compute[225701]: 2026-01-23 10:28:07.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:28:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:08 np0005593295 nova_compute[225701]: 2026-01-23 10:28:08.060 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:28:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:09 np0005593295 nova_compute[225701]: 2026-01-23 10:28:09.126 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:09.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:28:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:09.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:28:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:10 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:11.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:11.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:11 np0005593295 nova_compute[225701]: 2026-01-23 10:28:11.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:11 np0005593295 nova_compute[225701]: 2026-01-23 10:28:11.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:11 np0005593295 nova_compute[225701]: 2026-01-23 10:28:11.810 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:28:11 np0005593295 nova_compute[225701]: 2026-01-23 10:28:11.810 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:28:11 np0005593295 nova_compute[225701]: 2026-01-23 10:28:11.810 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:11 np0005593295 nova_compute[225701]: 2026-01-23 10:28:11.810 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:28:11 np0005593295 nova_compute[225701]: 2026-01-23 10:28:11.811 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:28:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:11 np0005593295 nova_compute[225701]: 2026-01-23 10:28:11.908 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:12 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:28:12 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/676391413' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:28:12 np0005593295 nova_compute[225701]: 2026-01-23 10:28:12.306 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:28:12 np0005593295 nova_compute[225701]: 2026-01-23 10:28:12.478 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:28:12 np0005593295 nova_compute[225701]: 2026-01-23 10:28:12.479 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4879MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:28:12 np0005593295 nova_compute[225701]: 2026-01-23 10:28:12.479 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:28:12 np0005593295 nova_compute[225701]: 2026-01-23 10:28:12.480 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:28:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:13 np0005593295 nova_compute[225701]: 2026-01-23 10:28:13.129 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:28:13 np0005593295 nova_compute[225701]: 2026-01-23 10:28:13.130 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:28:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:13 np0005593295 nova_compute[225701]: 2026-01-23 10:28:13.189 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Refreshing inventories for resource provider db762d15-510c-4120-bfc4-afe76b90b657 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:28:13 np0005593295 nova_compute[225701]: 2026-01-23 10:28:13.209 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Updating ProviderTree inventory for provider db762d15-510c-4120-bfc4-afe76b90b657 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:28:13 np0005593295 nova_compute[225701]: 2026-01-23 10:28:13.210 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Updating inventory in ProviderTree for provider db762d15-510c-4120-bfc4-afe76b90b657 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:28:13 np0005593295 nova_compute[225701]: 2026-01-23 10:28:13.227 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Refreshing aggregate associations for resource provider db762d15-510c-4120-bfc4-afe76b90b657, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:28:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:13.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:13 np0005593295 nova_compute[225701]: 2026-01-23 10:28:13.246 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Refreshing trait associations for resource provider db762d15-510c-4120-bfc4-afe76b90b657, traits: COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX2,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_F16C,HW_CPU_X86_ABM,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_BMI2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SVM,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_FDC,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_CLMUL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:28:13 np0005593295 nova_compute[225701]: 2026-01-23 10:28:13.269 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:28:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:28:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:13.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:28:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:14 np0005593295 nova_compute[225701]: 2026-01-23 10:28:14.129 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:15.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:15 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:15.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:16 np0005593295 nova_compute[225701]: 2026-01-23 10:28:16.910 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:17.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:17.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:18 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:28:18 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3824772010' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:28:18 np0005593295 nova_compute[225701]: 2026-01-23 10:28:18.877 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 5.607s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:28:18 np0005593295 nova_compute[225701]: 2026-01-23 10:28:18.885 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:28:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:18 np0005593295 nova_compute[225701]: 2026-01-23 10:28:18.907 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:28:18 np0005593295 nova_compute[225701]: 2026-01-23 10:28:18.940 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:28:18 np0005593295 nova_compute[225701]: 2026-01-23 10:28:18.941 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 6.461s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:19 np0005593295 nova_compute[225701]: 2026-01-23 10:28:19.170 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:28:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:19.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:28:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:28:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:19.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:28:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:20 np0005593295 nova_compute[225701]: 2026-01-23 10:28:20.941 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:20 np0005593295 nova_compute[225701]: 2026-01-23 10:28:20.942 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:20 np0005593295 nova_compute[225701]: 2026-01-23 10:28:20.942 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:20 np0005593295 nova_compute[225701]: 2026-01-23 10:28:20.942 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:20 np0005593295 nova_compute[225701]: 2026-01-23 10:28:20.942 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:20 np0005593295 nova_compute[225701]: 2026-01-23 10:28:20.942 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:28:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:21 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:21.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:21.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:21 np0005593295 nova_compute[225701]: 2026-01-23 10:28:21.913 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:28:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:23.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:28:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:23.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:24 np0005593295 nova_compute[225701]: 2026-01-23 10:28:24.173 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:25.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:25.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:26 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:26 np0005593295 nova_compute[225701]: 2026-01-23 10:28:26.915 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:27.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:27.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:29 np0005593295 nova_compute[225701]: 2026-01-23 10:28:29.221 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:29.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:28:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:29.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:28:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:30 np0005593295 podman[238080]: 2026-01-23 10:28:30.627582459 +0000 UTC m=+0.052264125 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Jan 23 05:28:30 np0005593295 podman[238079]: 2026-01-23 10:28:30.651822638 +0000 UTC m=+0.076956416 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 23 05:28:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:31.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:31 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:31.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:31 np0005593295 nova_compute[225701]: 2026-01-23 10:28:31.917 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:33.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:33.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:33 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:28:33 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:28:33 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:28:33 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:28:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:34 np0005593295 nova_compute[225701]: 2026-01-23 10:28:34.223 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:35.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:28:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:35.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:28:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:36 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:36 np0005593295 nova_compute[225701]: 2026-01-23 10:28:36.919 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:37.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:28:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:37.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:28:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:37 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:28:37 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:28:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:39 np0005593295 nova_compute[225701]: 2026-01-23 10:28:39.225 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:39.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:39.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:41.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:41 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:41.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:41 np0005593295 nova_compute[225701]: 2026-01-23 10:28:41.923 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:43.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:28:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:43.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:28:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:44 np0005593295 nova_compute[225701]: 2026-01-23 10:28:44.227 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:45.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:28:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:45.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:28:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:46 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:46 np0005593295 nova_compute[225701]: 2026-01-23 10:28:46.925 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:47.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:47.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:48 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 23 05:28:48 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/292376593' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:28:48 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 23 05:28:48 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/292376593' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:28:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:49 np0005593295 nova_compute[225701]: 2026-01-23 10:28:49.229 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:49.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:28:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:49.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:28:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:50 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:28:50.976 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:28:50 np0005593295 nova_compute[225701]: 2026-01-23 10:28:50.977 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:50 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:28:50.978 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:28:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:28:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:51.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:28:51 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:51.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:51 np0005593295 nova_compute[225701]: 2026-01-23 10:28:51.928 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:52 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Jan 23 05:28:52 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:52.971077) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:28:52 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Jan 23 05:28:52 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164132971399, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2385, "num_deletes": 251, "total_data_size": 6544207, "memory_usage": 6629936, "flush_reason": "Manual Compaction"}
Jan 23 05:28:52 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Jan 23 05:28:52 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:28:52.980 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:28:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164133284449, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 4222889, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31244, "largest_seqno": 33624, "table_properties": {"data_size": 4213075, "index_size": 6244, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 19896, "raw_average_key_size": 20, "raw_value_size": 4193741, "raw_average_value_size": 4305, "num_data_blocks": 264, "num_entries": 974, "num_filter_entries": 974, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163913, "oldest_key_time": 1769163913, "file_creation_time": 1769164132, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 313358 microseconds, and 14499 cpu microseconds.
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:28:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:28:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:53.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.284538) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 4222889 bytes OK
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.284574) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.308106) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.308170) EVENT_LOG_v1 {"time_micros": 1769164133308158, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.308206) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 6533777, prev total WAL file size 6533777, number of live WAL files 2.
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.310275) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(4123KB)], [60(12MB)]
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164133310439, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 16839391, "oldest_snapshot_seqno": -1}
Jan 23 05:28:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:28:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:53.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 6335 keys, 14607145 bytes, temperature: kUnknown
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164133588349, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 14607145, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14564574, "index_size": 25629, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15877, "raw_key_size": 162325, "raw_average_key_size": 25, "raw_value_size": 14450187, "raw_average_value_size": 2281, "num_data_blocks": 1024, "num_entries": 6335, "num_filter_entries": 6335, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769164133, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.588636) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 14607145 bytes
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.591525) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 60.6 rd, 52.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 12.0 +0.0 blob) out(13.9 +0.0 blob), read-write-amplify(7.4) write-amplify(3.5) OK, records in: 6853, records dropped: 518 output_compression: NoCompression
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.591550) EVENT_LOG_v1 {"time_micros": 1769164133591540, "job": 36, "event": "compaction_finished", "compaction_time_micros": 278019, "compaction_time_cpu_micros": 42860, "output_level": 6, "num_output_files": 1, "total_output_size": 14607145, "num_input_records": 6853, "num_output_records": 6335, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164133592439, "job": 36, "event": "table_file_deletion", "file_number": 62}
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164133594779, "job": 36, "event": "table_file_deletion", "file_number": 60}
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.310150) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.595013) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.595024) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.595028) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.595033) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:28:53 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:28:53.595036) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:28:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:54 np0005593295 nova_compute[225701]: 2026-01-23 10:28:54.231 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:28:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:55.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:28:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:28:55.496 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:28:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:28:55.496 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:28:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:28:55.496 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:28:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:55.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:28:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:56 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:56 np0005593295 nova_compute[225701]: 2026-01-23 10:28:56.930 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:57.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:57.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:28:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:28:59 np0005593295 nova_compute[225701]: 2026-01-23 10:28:59.233 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:28:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:59.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:28:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:28:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:59.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:28:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:01.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:01.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:01 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:01 np0005593295 podman[238284]: 2026-01-23 10:29:01.652272541 +0000 UTC m=+0.063476052 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:29:01 np0005593295 podman[238283]: 2026-01-23 10:29:01.712340788 +0000 UTC m=+0.126484801 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, 
org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:29:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:01 np0005593295 nova_compute[225701]: 2026-01-23 10:29:01.963 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:03.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:03.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:04 np0005593295 nova_compute[225701]: 2026-01-23 10:29:04.235 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:05.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:05.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:05 np0005593295 nova_compute[225701]: 2026-01-23 10:29:05.779 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:06 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:06 np0005593295 nova_compute[225701]: 2026-01-23 10:29:06.999 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:07.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:07.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:07 np0005593295 nova_compute[225701]: 2026-01-23 10:29:07.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:07 np0005593295 nova_compute[225701]: 2026-01-23 10:29:07.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:29:07 np0005593295 nova_compute[225701]: 2026-01-23 10:29:07.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:29:07 np0005593295 nova_compute[225701]: 2026-01-23 10:29:07.798 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:29:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:09 np0005593295 nova_compute[225701]: 2026-01-23 10:29:09.237 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:09.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:09.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:11.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:11.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:11 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:12 np0005593295 nova_compute[225701]: 2026-01-23 10:29:12.002 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:12 np0005593295 nova_compute[225701]: 2026-01-23 10:29:12.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:12 np0005593295 nova_compute[225701]: 2026-01-23 10:29:12.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:12 np0005593295 nova_compute[225701]: 2026-01-23 10:29:12.804 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:29:12 np0005593295 nova_compute[225701]: 2026-01-23 10:29:12.804 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:29:12 np0005593295 nova_compute[225701]: 2026-01-23 10:29:12.804 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:29:12 np0005593295 nova_compute[225701]: 2026-01-23 10:29:12.804 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:29:12 np0005593295 nova_compute[225701]: 2026-01-23 10:29:12.805 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:29:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:13.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:13 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:29:13 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3300512477' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:29:13 np0005593295 nova_compute[225701]: 2026-01-23 10:29:13.369 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.565s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:29:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:13.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:13 np0005593295 nova_compute[225701]: 2026-01-23 10:29:13.557 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:29:13 np0005593295 nova_compute[225701]: 2026-01-23 10:29:13.558 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4920MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:29:13 np0005593295 nova_compute[225701]: 2026-01-23 10:29:13.559 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:29:13 np0005593295 nova_compute[225701]: 2026-01-23 10:29:13.559 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:29:13 np0005593295 nova_compute[225701]: 2026-01-23 10:29:13.629 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:29:13 np0005593295 nova_compute[225701]: 2026-01-23 10:29:13.629 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:29:13 np0005593295 nova_compute[225701]: 2026-01-23 10:29:13.643 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:29:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:14 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:29:14 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/423661688' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:29:14 np0005593295 nova_compute[225701]: 2026-01-23 10:29:14.062 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:29:14 np0005593295 nova_compute[225701]: 2026-01-23 10:29:14.067 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:29:14 np0005593295 nova_compute[225701]: 2026-01-23 10:29:14.112 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:29:14 np0005593295 nova_compute[225701]: 2026-01-23 10:29:14.114 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:29:14 np0005593295 nova_compute[225701]: 2026-01-23 10:29:14.114 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.555s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:29:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:14 np0005593295 nova_compute[225701]: 2026-01-23 10:29:14.239 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:15.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:15.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:16 np0005593295 nova_compute[225701]: 2026-01-23 10:29:16.114 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:16 np0005593295 nova_compute[225701]: 2026-01-23 10:29:16.133 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:16 np0005593295 nova_compute[225701]: 2026-01-23 10:29:16.133 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:16 np0005593295 nova_compute[225701]: 2026-01-23 10:29:16.133 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:16 np0005593295 nova_compute[225701]: 2026-01-23 10:29:16.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:16 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:17 np0005593295 nova_compute[225701]: 2026-01-23 10:29:17.047 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:17.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:17.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:18 np0005593295 nova_compute[225701]: 2026-01-23 10:29:18.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:18 np0005593295 nova_compute[225701]: 2026-01-23 10:29:18.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:29:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:19 np0005593295 nova_compute[225701]: 2026-01-23 10:29:19.274 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:19.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:29:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:19.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:29:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:21.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:21.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:22 np0005593295 nova_compute[225701]: 2026-01-23 10:29:22.049 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:22 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:23.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:23.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:24 np0005593295 nova_compute[225701]: 2026-01-23 10:29:24.277 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:25.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:25.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:27 np0005593295 nova_compute[225701]: 2026-01-23 10:29:27.051 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:27.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:27.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:28 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:29 np0005593295 nova_compute[225701]: 2026-01-23 10:29:29.279 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:29.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:29.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:31.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:31.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:32 np0005593295 nova_compute[225701]: 2026-01-23 10:29:32.055 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:32 np0005593295 podman[238453]: 2026-01-23 10:29:32.624503066 +0000 UTC m=+0.049549057 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 05:29:32 np0005593295 podman[238452]: 2026-01-23 10:29:32.663019069 +0000 UTC m=+0.089378262 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:29:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:33 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:33.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:33.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:34 np0005593295 nova_compute[225701]: 2026-01-23 10:29:34.282 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:35.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:35.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:37 np0005593295 nova_compute[225701]: 2026-01-23 10:29:37.057 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:37.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:37.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:38 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:39 np0005593295 nova_compute[225701]: 2026-01-23 10:29:39.283 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:39 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:29:39 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:29:39 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:29:39 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:29:39 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:29:39 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:29:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:39.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:39.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:41.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:41.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:42 np0005593295 nova_compute[225701]: 2026-01-23 10:29:42.060 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:43 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:43.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:43.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:44 np0005593295 nova_compute[225701]: 2026-01-23 10:29:44.285 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:45.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:45 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:29:45 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:29:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:45.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:47 np0005593295 nova_compute[225701]: 2026-01-23 10:29:47.062 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:47.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:47.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:48 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:49 np0005593295 nova_compute[225701]: 2026-01-23 10:29:49.288 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:49.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:49.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:51.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:51.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:52 np0005593295 nova_compute[225701]: 2026-01-23 10:29:52.065 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:53 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:53.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:53.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:54 np0005593295 nova_compute[225701]: 2026-01-23 10:29:54.288 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:55.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:29:55.498 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:29:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:29:55.499 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:29:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:29:55.499 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:29:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:55.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:56 np0005593295 systemd-logind[786]: New session 55 of user zuul.
Jan 23 05:29:56 np0005593295 systemd[1]: Started Session 55 of User zuul.
Jan 23 05:29:57 np0005593295 nova_compute[225701]: 2026-01-23 10:29:57.067 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:57.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:57.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:58 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:29:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:29:59 np0005593295 nova_compute[225701]: 2026-01-23 10:29:59.289 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:59.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:29:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:29:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:59.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:29:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:29:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:00 np0005593295 ceph-mon[75771]: Health detail: HEALTH_WARN 2 OSD(s) experiencing slow operations in BlueStore; 2 failed cephadm daemon(s)
Jan 23 05:30:00 np0005593295 ceph-mon[75771]: [WRN] BLUESTORE_SLOW_OP_ALERT: 2 OSD(s) experiencing slow operations in BlueStore
Jan 23 05:30:00 np0005593295 ceph-mon[75771]:     osd.1 observed slow operation indications in BlueStore
Jan 23 05:30:00 np0005593295 ceph-mon[75771]:     osd.2 observed slow operation indications in BlueStore
Jan 23 05:30:00 np0005593295 ceph-mon[75771]: [WRN] CEPHADM_FAILED_DAEMON: 2 failed cephadm daemon(s)
Jan 23 05:30:00 np0005593295 ceph-mon[75771]:    daemon nfs.cephfs.2.0.compute-0.fenqiu on compute-0 is in error state
Jan 23 05:30:00 np0005593295 ceph-mon[75771]:    daemon nfs.cephfs.1.0.compute-2.tykohi on compute-2 is in error state
Jan 23 05:30:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:30:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:01.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:30:01 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Jan 23 05:30:01 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2989331378' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 05:30:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:30:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:01.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:30:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:02 np0005593295 nova_compute[225701]: 2026-01-23 10:30:02.070 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:03.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:03.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:03 np0005593295 podman[238981]: 2026-01-23 10:30:03.628635623 +0000 UTC m=+0.053841355 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 23 05:30:03 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:03 np0005593295 podman[238980]: 2026-01-23 10:30:03.65846024 +0000 UTC m=+0.083711283 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:30:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:04 np0005593295 nova_compute[225701]: 2026-01-23 10:30:04.291 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:05.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:30:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:05.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:30:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:06 np0005593295 ovs-vsctl[239080]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 23 05:30:06 np0005593295 nova_compute[225701]: 2026-01-23 10:30:06.778 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:07 np0005593295 nova_compute[225701]: 2026-01-23 10:30:07.073 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:30:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:07.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:30:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:30:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:07.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:30:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:08 np0005593295 virtqemud[225221]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 23 05:30:08 np0005593295 virtqemud[225221]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 23 05:30:08 np0005593295 virtqemud[225221]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 23 05:30:08 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:08 np0005593295 nova_compute[225701]: 2026-01-23 10:30:08.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:08 np0005593295 nova_compute[225701]: 2026-01-23 10:30:08.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:30:08 np0005593295 nova_compute[225701]: 2026-01-23 10:30:08.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:30:08 np0005593295 nova_compute[225701]: 2026-01-23 10:30:08.852 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:30:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:08 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: cache status {prefix=cache status} (starting...)
Jan 23 05:30:09 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: client ls {prefix=client ls} (starting...)
Jan 23 05:30:09 np0005593295 lvm[239453]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 05:30:09 np0005593295 lvm[239453]: VG ceph_vg0 finished
Jan 23 05:30:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:09 np0005593295 nova_compute[225701]: 2026-01-23 10:30:09.294 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:09.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:09.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:09 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: damage ls {prefix=damage ls} (starting...)
Jan 23 05:30:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:09 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump loads {prefix=dump loads} (starting...)
Jan 23 05:30:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:10 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Jan 23 05:30:10 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1500381436' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 05:30:10 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 23 05:30:10 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 23 05:30:10 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 23 05:30:10 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 23 05:30:10 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 23 05:30:10 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Jan 23 05:30:10 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/824697158' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 05:30:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:11 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 23 05:30:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:11 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: ops {prefix=ops} (starting...)
Jan 23 05:30:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:11.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:11 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Jan 23 05:30:11 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3358390859' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 05:30:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:11.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:11 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Jan 23 05:30:11 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2288603447' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 05:30:11 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 23 05:30:11 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2183856151' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 05:30:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:12 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: session ls {prefix=session ls} (starting...)
Jan 23 05:30:12 np0005593295 nova_compute[225701]: 2026-01-23 10:30:12.109 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:12 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: status {prefix=status} (starting...)
Jan 23 05:30:12 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Jan 23 05:30:12 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2631441634' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 05:30:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:13.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:13 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 23 05:30:13 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2510791750' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 05:30:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:30:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:13.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:30:13 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:13 np0005593295 nova_compute[225701]: 2026-01-23 10:30:13.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:13 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 23 05:30:13 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1794080162' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 05:30:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:14 np0005593295 nova_compute[225701]: 2026-01-23 10:30:14.344 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:14 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Jan 23 05:30:14 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3596812779' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 05:30:14 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Jan 23 05:30:14 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1267146393' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 05:30:14 np0005593295 nova_compute[225701]: 2026-01-23 10:30:14.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:14 np0005593295 nova_compute[225701]: 2026-01-23 10:30:14.918 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:14 np0005593295 nova_compute[225701]: 2026-01-23 10:30:14.919 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:14 np0005593295 nova_compute[225701]: 2026-01-23 10:30:14.919 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:14 np0005593295 nova_compute[225701]: 2026-01-23 10:30:14.919 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:30:14 np0005593295 nova_compute[225701]: 2026-01-23 10:30:14.919 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:15 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 23 05:30:15 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3550185203' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 05:30:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:15.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:15 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 23 05:30:15 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/834363629' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 05:30:15 np0005593295 nova_compute[225701]: 2026-01-23 10:30:15.536 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.617s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:30:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:15.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:30:15 np0005593295 nova_compute[225701]: 2026-01-23 10:30:15.725 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:30:15 np0005593295 nova_compute[225701]: 2026-01-23 10:30:15.727 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4649MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:30:15 np0005593295 nova_compute[225701]: 2026-01-23 10:30:15.727 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:15 np0005593295 nova_compute[225701]: 2026-01-23 10:30:15.727 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:15 np0005593295 nova_compute[225701]: 2026-01-23 10:30:15.856 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:30:15 np0005593295 nova_compute[225701]: 2026-01-23 10:30:15.857 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:30:15 np0005593295 nova_compute[225701]: 2026-01-23 10:30:15.879 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70795264 unmapped: 466944 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70803456 unmapped: 458752 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70803456 unmapped: 458752 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70811648 unmapped: 450560 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 820182 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70811648 unmapped: 450560 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70811648 unmapped: 450560 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70819840 unmapped: 442368 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 434176 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 434176 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 820182 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 434176 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70860800 unmapped: 401408 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70877184 unmapped: 385024 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70893568 unmapped: 368640 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70893568 unmapped: 368640 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 71.207946777s of 71.231231689s, submitted: 2
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 821694 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70893568 unmapped: 368640 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70877184 unmapped: 385024 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70877184 unmapped: 385024 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70909952 unmapped: 352256 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70909952 unmapped: 352256 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 823206 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70909952 unmapped: 352256 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70926336 unmapped: 335872 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70934528 unmapped: 327680 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70950912 unmapped: 311296 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70950912 unmapped: 311296 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 823206 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70950912 unmapped: 311296 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70983680 unmapped: 278528 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 70983680 unmapped: 278528 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 262144 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 262144 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 823206 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 262144 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 237568 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 237568 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71041024 unmapped: 221184 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71041024 unmapped: 221184 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 823206 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71041024 unmapped: 221184 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71049216 unmapped: 212992 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71049216 unmapped: 212992 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71057408 unmapped: 204800 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71065600 unmapped: 196608 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 823206 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71065600 unmapped: 196608 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 163840 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 163840 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71090176 unmapped: 172032 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 163840 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 823206 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 163840 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 163840 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71106560 unmapped: 155648 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71114752 unmapped: 147456 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71122944 unmapped: 139264 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 823206 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71122944 unmapped: 139264 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71139328 unmapped: 122880 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 114688 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 114688 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 106496 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 823206 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 106496 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 106496 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 98304 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 98304 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 98304 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 823206 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 90112 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71180288 unmapped: 81920 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 47.972595215s of 47.983440399s, submitted: 2
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 65536 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 40960 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 32768 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 826230 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 32768 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 16384 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 16384 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 16384 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 16384 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 825048 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 8192 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 8192 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 8192 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 0 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 0 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 825048 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 1015808 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 1015808 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 825048 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 1015808 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 1015808 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 1015808 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 1007616 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 825048 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 1007616 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 999424 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 983040 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 825048 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 974848 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 974848 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 966656 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 958464 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 950272 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 825048 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 950272 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 950272 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71368704 unmapped: 942080 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 825048 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71385088 unmapped: 925696 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71385088 unmapped: 925696 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 917504 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71409664 unmapped: 901120 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71409664 unmapped: 901120 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 825048 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 884736 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559226affc00 session 0x55922546ef00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 884736 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 43.980842590s of 44.448013306s, submitted: 4
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71442432 unmapped: 868352 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 825969 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71442432 unmapped: 868352 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 860160 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71458816 unmapped: 851968 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71458816 unmapped: 851968 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71458816 unmapped: 851968 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 825969 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71458816 unmapped: 851968 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71467008 unmapped: 843776 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71467008 unmapped: 843776 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71475200 unmapped: 835584 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71475200 unmapped: 835584 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 825969 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71475200 unmapped: 835584 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.276217461s of 14.299237251s, submitted: 2
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828993 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71499776 unmapped: 811008 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71499776 unmapped: 811008 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 778240 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 778240 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 778240 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 770048 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71548928 unmapped: 761856 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 720896 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 720896 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 720896 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 696320 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 696320 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 696320 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 688128 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 688128 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71639040 unmapped: 671744 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71655424 unmapped: 655360 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71655424 unmapped: 655360 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread fragmentation_score=0.000021 took=0.000131s
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71671808 unmapped: 638976 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 630784 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 630784 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 622592 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 622592 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71696384 unmapped: 614400 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71696384 unmapped: 614400 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71696384 unmapped: 614400 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 589824 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 589824 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71745536 unmapped: 565248 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71745536 unmapped: 565248 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 548864 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 548864 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 548864 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 524288 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559226b07400 session 0x55922657e1e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:30:16 np0005593295 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 499712 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 499712 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 499712 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 491520 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 491520 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 483328 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71835648 unmapped: 475136 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71835648 unmapped: 475136 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 466944 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 466944 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71852032 unmapped: 458752 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71852032 unmapped: 458752 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71852032 unmapped: 458752 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71860224 unmapped: 450560 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71868416 unmapped: 442368 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 434176 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71884800 unmapped: 425984 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71884800 unmapped: 425984 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71884800 unmapped: 425984 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828402 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71892992 unmapped: 417792 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71892992 unmapped: 417792 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 95.813194275s of 96.227020264s, submitted: 3
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71892992 unmapped: 417792 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71901184 unmapped: 409600 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71901184 unmapped: 409600 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 827220 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 401408 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71917568 unmapped: 393216 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71917568 unmapped: 393216 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 385024 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 385024 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 827220 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 385024 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71933952 unmapped: 376832 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71950336 unmapped: 360448 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 352256 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 352256 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 827220 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 352256 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 344064 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559226afe400 session 0x55922677f0e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 344064 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 344064 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71974912 unmapped: 335872 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 827220 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71974912 unmapped: 335872 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 71974912 unmapped: 335872 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 600.1 total, 600.0 interval
Cumulative writes: 5367 writes, 23K keys, 5367 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
Cumulative WAL: 5367 writes, 783 syncs, 6.85 writes per sync, written: 0.02 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 5367 writes, 23K keys, 5367 commit groups, 1.0 writes per commit group, ingest: 18.76 MB, 0.03 MB/s
Interval WAL: 5367 writes, 783 syncs, 6.85 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.6e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 9.6e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 270336 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72048640 unmapped: 262144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72056832 unmapped: 253952 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 827220 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72056832 unmapped: 253952 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72056832 unmapped: 253952 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 245760 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 245760 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72081408 unmapped: 229376 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 827220 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72081408 unmapped: 229376 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72081408 unmapped: 229376 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72089600 unmapped: 221184 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72089600 unmapped: 221184 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 32.414127350s of 32.532848358s, submitted: 2
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 212992 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828732 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 212992 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72105984 unmapped: 204800 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72114176 unmapped: 196608 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72114176 unmapped: 196608 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72114176 unmapped: 196608 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828732 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 188416 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 188416 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72130560 unmapped: 180224 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 172032 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 172032 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828732 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 163840 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 163840 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 163840 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 155648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 155648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828732 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 147456 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 147456 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 147456 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559226af5c00 session 0x55922546e960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72171520 unmapped: 139264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.672498703s of 19.676660538s, submitted: 1
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72220672 unmapped: 1138688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828732 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72384512 unmapped: 974848 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72392704 unmapped: 966656 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 761856 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,0,3])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72622080 unmapped: 737280 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72622080 unmapped: 737280 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828948 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,3])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 720896 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72646656 unmapped: 712704 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72646656 unmapped: 712704 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72646656 unmapped: 712704 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72646656 unmapped: 712704 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 5.810975552s of 11.037171364s, submitted: 201
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828732 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 696320 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 696320 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 679936 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 679936 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72687616 unmapped: 671744 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828732 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 663552 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 663552 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72704000 unmapped: 655360 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72704000 unmapped: 655360 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 647168 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 630784 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 630784 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72744960 unmapped: 614400 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72744960 unmapped: 614400 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72744960 unmapped: 614400 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 606208 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 606208 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 598016 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 598016 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72769536 unmapped: 589824 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72769536 unmapped: 589824 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72777728 unmapped: 581632 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72777728 unmapped: 581632 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72777728 unmapped: 581632 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 565248 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 557056 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72810496 unmapped: 548864 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72810496 unmapped: 548864 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72810496 unmapped: 548864 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72818688 unmapped: 540672 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72818688 unmapped: 540672 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72818688 unmapped: 540672 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 532480 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72835072 unmapped: 524288 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72851456 unmapped: 507904 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72851456 unmapped: 507904 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72859648 unmapped: 499712 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72859648 unmapped: 499712 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72867840 unmapped: 491520 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72884224 unmapped: 475136 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 466944 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 466944 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 466944 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 458752 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72900608 unmapped: 458752 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 450560 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 450560 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 450560 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72916992 unmapped: 442368 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72916992 unmapped: 442368 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72941568 unmapped: 417792 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72941568 unmapped: 417792 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72949760 unmapped: 409600 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72949760 unmapped: 409600 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72949760 unmapped: 409600 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72957952 unmapped: 401408 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72957952 unmapped: 401408 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72974336 unmapped: 385024 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72974336 unmapped: 385024 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72982528 unmapped: 376832 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72990720 unmapped: 368640 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72990720 unmapped: 368640 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72998912 unmapped: 360448 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559226b08000 session 0x559224554f00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 72998912 unmapped: 360448 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73015296 unmapped: 344064 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73015296 unmapped: 344064 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73015296 unmapped: 344064 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73023488 unmapped: 335872 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559226b07000 session 0x559224ef0b40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73023488 unmapped: 335872 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73031680 unmapped: 327680 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73048064 unmapped: 311296 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73048064 unmapped: 311296 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73056256 unmapped: 303104 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73056256 unmapped: 303104 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73064448 unmapped: 294912 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829653 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73064448 unmapped: 294912 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73072640 unmapped: 286720 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 75.457023621s of 77.777244568s, submitted: 17
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73072640 unmapped: 286720 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73072640 unmapped: 286720 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73072640 unmapped: 286720 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831165 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73080832 unmapped: 278528 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73080832 unmapped: 278528 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73089024 unmapped: 270336 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73089024 unmapped: 270336 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73089024 unmapped: 270336 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 832086 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 245760 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 245760 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 245760 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73113600 unmapped: 245760 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73121792 unmapped: 237568 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835110 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 221184 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 221184 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.212070465s of 15.231811523s, submitted: 5
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 221184 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 221184 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 221184 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 221184 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 221184 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 221184 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 221184 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 221184 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73138176 unmapped: 221184 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73146368 unmapped: 212992 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73154560 unmapped: 204800 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73154560 unmapped: 204800 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73154560 unmapped: 204800 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73154560 unmapped: 204800 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73154560 unmapped: 204800 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73154560 unmapped: 204800 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73154560 unmapped: 204800 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73154560 unmapped: 204800 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73170944 unmapped: 188416 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73170944 unmapped: 188416 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73170944 unmapped: 188416 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73170944 unmapped: 188416 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73179136 unmapped: 180224 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73179136 unmapped: 180224 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73179136 unmapped: 180224 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73179136 unmapped: 180224 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73179136 unmapped: 180224 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73179136 unmapped: 180224 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73179136 unmapped: 180224 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73179136 unmapped: 180224 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 172032 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73195520 unmapped: 163840 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73195520 unmapped: 163840 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x5592261edc00 session 0x559226f241e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73195520 unmapped: 163840 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73195520 unmapped: 163840 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 155648 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 155648 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 155648 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 155648 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 155648 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 139264 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 139264 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 139264 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 139264 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 139264 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 139264 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833928 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 139264 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 139264 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 139264 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 71.693893433s of 71.732337952s, submitted: 2
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835440 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834258 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834258 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834258 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834258 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 131072 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834258 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559226b00800 session 0x559224742000
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834258 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834258 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 122880 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73252864 unmapped: 106496 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834258 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 40.526988983s of 41.823482513s, submitted: 3
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73252864 unmapped: 106496 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73252864 unmapped: 106496 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73252864 unmapped: 106496 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73252864 unmapped: 106496 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73252864 unmapped: 106496 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835770 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73252864 unmapped: 106496 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73261056 unmapped: 98304 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73261056 unmapped: 98304 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73261056 unmapped: 98304 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835179 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73285632 unmapped: 73728 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73285632 unmapped: 73728 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835179 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835179 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835179 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835179 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x5592247d7000 session 0x55922657e3c0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73326592 unmapped: 32768 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835179 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73334784 unmapped: 24576 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559226636c00 session 0x5592267a7860
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835179 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835179 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835179 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835179 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 56.421627045s of 56.530124664s, submitted: 2
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834588 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834588 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 114688 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73261056 unmapped: 98304 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834588 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834588 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559224eea800 session 0x559224743c20
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834588 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834588 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 81920 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73285632 unmapped: 73728 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834588 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 65536 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 65536 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 65536 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 65536 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 65536 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834588 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 65536 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 65536 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 65536 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 65536 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559226af8000 session 0x559226feeb40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834588 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834588 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 49152 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73318400 unmapped: 40960 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73318400 unmapped: 40960 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834588 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73318400 unmapped: 40960 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73318400 unmapped: 40960 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73318400 unmapped: 40960 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73318400 unmapped: 40960 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 57.254508972s of 57.258758545s, submitted: 1
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73326592 unmapped: 32768 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73326592 unmapped: 32768 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73326592 unmapped: 32768 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73326592 unmapped: 32768 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73326592 unmapped: 32768 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 16384 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 8192 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 8192 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 8192 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 8192 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 8192 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 8192 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 8192 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 8192 heap: 73359360 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 63.238079071s of 63.242374420s, submitted: 1
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73383936 unmapped: 1024000 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73383936 unmapped: 1024000 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837612 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x5592261ec800 session 0x559226fee960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 110.972518921s of 111.781974792s, submitted: 2
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.1 total, 600.0 interval
Cumulative writes: 5807 writes, 24K keys, 5807 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 5807 writes, 987 syncs, 5.88 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 440 writes, 717 keys, 440 commit groups, 1.0 writes per commit group, ingest: 0.23 MB, 0.00 MB/s
Interval WAL: 440 writes, 204 syncs, 2.16 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x5592261f1400 session 0x559226fefc20
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 93.923355103s of 93.927070618s, submitted: 1
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74350592 unmapped: 1105920 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1122304 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1122304 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1122304 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1122304 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 1097728 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1089536 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74383360 unmapped: 1073152 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838014 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75448320 unmapped: 1056768 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 1048576 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 1048576 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.652290344s of 11.764292717s, submitted: 230
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840966 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75464704 unmapped: 2088960 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75464704 unmapped: 2088960 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75464704 unmapped: 2088960 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559226afa000 session 0x5592254723c0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 2064384 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 2064384 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 2064384 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 41.259738922s of 41.266269684s, submitted: 3
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 841887 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 841887 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1982464 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1982464 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1982464 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75579392 unmapped: 1974272 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 handle_osd_map epochs [138,139], i have 138, src has [1,139]
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 130.189498901s of 130.569747925s, submitted: 3
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 138 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75636736 unmapped: 1916928 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846191 data_alloc: 218103808 data_used: 40960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 139 handle_osd_map epochs [139,140], i have 139, src has [1,140]
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 140 handle_osd_map epochs [140,140], i have 140, src has [1,140]
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 140 heartbeat osd_stat(store_statfs(0x4fca7a000/0x0/0x4ffc00000, data 0xed7f2/0x1a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 76718080 unmapped: 835584 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 16392192 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 140 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 141 ms_handle_reset con 0x559226af8800 session 0x559227226780
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fba75000/0x0/0x4ffc00000, data 0x10ef970/0x11a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 77979648 unmapped: 16359424 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fba75000/0x0/0x4ffc00000, data 0x10ef970/0x11a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78102528 unmapped: 16236544 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 141 handle_osd_map epochs [141,142], i have 141, src has [1,142]
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 142 ms_handle_reset con 0x559224eeb000 session 0x559227226d20
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78127104 unmapped: 16211968 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967505 data_alloc: 218103808 data_used: 45056
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78127104 unmapped: 16211968 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78127104 unmapped: 16211968 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 142 handle_osd_map epochs [142,143], i have 142, src has [1,143]
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6d000/0x0/0x4ffc00000, data 0x10f3bc6/0x11ae000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969935 data_alloc: 218103808 data_used: 45056
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969935 data_alloc: 218103808 data_used: 45056
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226b04c00 session 0x559226feeb40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969935 data_alloc: 218103808 data_used: 45056
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969935 data_alloc: 218103808 data_used: 45056
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226636c00 session 0x55922721e780
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969935 data_alloc: 218103808 data_used: 45056
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 33.924980164s of 34.470951080s, submitted: 51
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971447 data_alloc: 218103808 data_used: 45056
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226728800 session 0x55922677e960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971447 data_alloc: 218103808 data_used: 45056
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226af8400 session 0x559227227e00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x5592261e8400 session 0x55922723c5a0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226634000 session 0x55922723c780
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78168064 unmapped: 16171008 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78168064 unmapped: 16171008 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226afcc00 session 0x55922723c960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226afcc00 session 0x55922723cd20
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 92913664 unmapped: 1425408 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226af6400 session 0x55922723cf00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 92938240 unmapped: 1400832 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.170117378s of 10.184672356s, submitted: 3
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 145 heartbeat osd_stat(store_statfs(0x4fba67000/0x0/0x4ffc00000, data 0x10f7c84/0x11b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,7])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 145 ms_handle_reset con 0x5592261e1000 session 0x55922723d0e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93839360 unmapped: 17342464 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123797 data_alloc: 234881024 data_used: 13676544
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 145 ms_handle_reset con 0x5592261e1800 session 0x55922723da40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93863936 unmapped: 17317888 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93855744 unmapped: 17326080 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 145 ms_handle_reset con 0x5592261e7c00 session 0x559226f24f00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93937664 unmapped: 17244160 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93937664 unmapped: 17244160 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 145 ms_handle_reset con 0x5592261e1000 session 0x559226f24960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 145 heartbeat osd_stat(store_statfs(0x4fac33000/0x0/0x4ffc00000, data 0x1f29dc4/0x1fe7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 145 ms_handle_reset con 0x5592261e1800 session 0x55922657f860
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93700096 unmapped: 17481728 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1128704 data_alloc: 234881024 data_used: 13676544
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93724672 unmapped: 17457152 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 99999744 unmapped: 11182080 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105676800 unmapped: 5505024 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105676800 unmapped: 5505024 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fac0c000/0x0/0x4ffc00000, data 0x1f4fda5/0x200f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.482107162s of 10.092863083s, submitted: 62
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1227418 data_alloc: 234881024 data_used: 25862144
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fac0c000/0x0/0x4ffc00000, data 0x1f4fda5/0x200f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fac0c000/0x0/0x4ffc00000, data 0x1f4fda5/0x200f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225987 data_alloc: 234881024 data_used: 25862144
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109395968 unmapped: 3883008 heap: 113278976 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa78e000/0x0/0x4ffc00000, data 0x23ceda5/0x248e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108773376 unmapped: 9756672 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.997505188s of 10.222607613s, submitted: 78
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9eb8000/0x0/0x4ffc00000, data 0x2ca4da5/0x2d64000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111665152 unmapped: 6864896 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1356813 data_alloc: 251658240 data_used: 27123712
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111968256 unmapped: 6561792 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c6c000/0x0/0x4ffc00000, data 0x2d4fda5/0x2e0f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112009216 unmapped: 6520832 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112009216 unmapped: 6520832 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112009216 unmapped: 6520832 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1357269 data_alloc: 251658240 data_used: 27136000
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112156672 unmapped: 6373376 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c49000/0x0/0x4ffc00000, data 0x2d73da5/0x2e33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112156672 unmapped: 6373376 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112173056 unmapped: 6356992 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112173056 unmapped: 6356992 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c49000/0x0/0x4ffc00000, data 0x2d73da5/0x2e33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112181248 unmapped: 6348800 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1354717 data_alloc: 251658240 data_used: 27205632
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112181248 unmapped: 6348800 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112181248 unmapped: 6348800 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112181248 unmapped: 6348800 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.461258888s of 13.723365784s, submitted: 31
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112353280 unmapped: 6176768 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c40000/0x0/0x4ffc00000, data 0x2d7cda5/0x2e3c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1354877 data_alloc: 251658240 data_used: 27205632
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c3f000/0x0/0x4ffc00000, data 0x2d7dda5/0x2e3d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1354877 data_alloc: 251658240 data_used: 27205632
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c3f000/0x0/0x4ffc00000, data 0x2d7dda5/0x2e3d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ec800 session 0x55922723d4a0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af7c00 session 0x55922723cb40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e6000 session 0x55922723dc20
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1354877 data_alloc: 251658240 data_used: 27205632
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e1000 session 0x5592254712c0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114376704 unmapped: 4153344 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e1800 session 0x559225471a40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c3f000/0x0/0x4ffc00000, data 0x2d7dda5/0x2e3d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114376704 unmapped: 4153344 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ec800 session 0x559225624f00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.490158081s of 14.116064072s, submitted: 3
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af7c00 session 0x559223b87e00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115531776 unmapped: 13500416 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115531776 unmapped: 13500416 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b6000/0x0/0x4ffc00000, data 0x3506da5/0x35c6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115564544 unmapped: 13467648 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1411121 data_alloc: 251658240 data_used: 29302784
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115564544 unmapped: 13467648 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115564544 unmapped: 13467648 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b4000/0x0/0x4ffc00000, data 0x3507da5/0x35c7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 13434880 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 13434880 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 13434880 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1411257 data_alloc: 251658240 data_used: 29302784
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e2000 session 0x55922546f0e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b4000/0x0/0x4ffc00000, data 0x3507da5/0x35c7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1411409 data_alloc: 251658240 data_used: 29306880
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afa400 session 0x5592247d8960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b4000/0x0/0x4ffc00000, data 0x3507da5/0x35c7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f2400 session 0x55922721fa40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.360092163s of 15.706788063s, submitted: 14
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f3400 session 0x55922721fc20
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115613696 unmapped: 13418496 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120020992 unmapped: 9011200 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1456479 data_alloc: 251658240 data_used: 33521664
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 8978432 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b3000/0x0/0x4ffc00000, data 0x3507db5/0x35c8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120086528 unmapped: 8945664 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120086528 unmapped: 8945664 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b3000/0x0/0x4ffc00000, data 0x3507db5/0x35c8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1456615 data_alloc: 251658240 data_used: 33521664
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b1000/0x0/0x4ffc00000, data 0x350adb5/0x35cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b1000/0x0/0x4ffc00000, data 0x350adb5/0x35cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 9330688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.960206985s of 12.248162270s, submitted: 5
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1457327 data_alloc: 251658240 data_used: 33529856
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119889920 unmapped: 9142272 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120283136 unmapped: 8749056 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 7888896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 7888896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f80f2000/0x0/0x4ffc00000, data 0x38bbdb5/0x397c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 7856128 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1500921 data_alloc: 251658240 data_used: 33931264
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 7856128 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 7856128 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120578048 unmapped: 8454144 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120578048 unmapped: 8454144 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f80f8000/0x0/0x4ffc00000, data 0x38c3db5/0x3984000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f80f8000/0x0/0x4ffc00000, data 0x38c3db5/0x3984000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120578048 unmapped: 8454144 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549ec00 session 0x55922669fe00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afd400 session 0x55922721e960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f80f8000/0x0/0x4ffc00000, data 0x38c3db5/0x3984000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1495881 data_alloc: 251658240 data_used: 33931264
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.955549240s of 10.352662086s, submitted: 63
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f2400 session 0x55922723c960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c2f000/0x0/0x4ffc00000, data 0x2d8dda5/0x2e4d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1356198 data_alloc: 234881024 data_used: 25632768
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c2f000/0x0/0x4ffc00000, data 0x2d8dda5/0x2e4d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1356198 data_alloc: 234881024 data_used: 25632768
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af6400 session 0x5592267a63c0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.853686333s of 10.125116348s, submitted: 14
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afcc00 session 0x5592272292c0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108109824 unmapped: 20922368 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f2800 session 0x55922723d4a0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08000 session 0x559227227a40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051233 data_alloc: 234881024 data_used: 15777792
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051233 data_alloc: 234881024 data_used: 15777792
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051233 data_alloc: 234881024 data_used: 15777792
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.126308441s of 17.207635880s, submitted: 34
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1054257 data_alloc: 234881024 data_used: 15777792
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109297664 unmapped: 19734528 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e9800 session 0x559226feed20
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108732416 unmapped: 20299776 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1053666 data_alloc: 234881024 data_used: 14729216
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676f400 session 0x559226784d20
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108339200 unmapped: 20692992 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c1000/0x0/0x4ffc00000, data 0x10fbdbf/0x11bb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108339200 unmapped: 20692992 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afb000 session 0x55922721eb40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b00000 session 0x55922669f0e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108707840 unmapped: 24526848 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afbc00 session 0x5592254eeb40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108707840 unmapped: 24526848 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108707840 unmapped: 24526848 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e9800 session 0x55922721fa40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1132656 data_alloc: 234881024 data_used: 14729216
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676f400 session 0x55922721fc20
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.649352074s of 13.054588318s, submitted: 40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108716032 unmapped: 24518656 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9f02000/0x0/0x4ffc00000, data 0x1abadf8/0x1b7a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afb000 session 0x55922721e960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109027328 unmapped: 24207360 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109027328 unmapped: 24207360 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9ede000/0x0/0x4ffc00000, data 0x1adedf8/0x1b9e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194396 data_alloc: 234881024 data_used: 20054016
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9ede000/0x0/0x4ffc00000, data 0x1adedf8/0x1b9e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194396 data_alloc: 234881024 data_used: 20054016
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9ede000/0x0/0x4ffc00000, data 0x1adedf8/0x1b9e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110280704 unmapped: 22953984 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110280704 unmapped: 22953984 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.911570549s of 12.916566849s, submitted: 2
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261eb400 session 0x559224554960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114982912 unmapped: 18251776 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115589120 unmapped: 17645568 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1292618 data_alloc: 234881024 data_used: 21278720
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b6000/0x0/0x4ffc00000, data 0x24fddf8/0x25bd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 17416192 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113991680 unmapped: 19243008 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113991680 unmapped: 19243008 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301298 data_alloc: 234881024 data_used: 21491712
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301618 data_alloc: 234881024 data_used: 21499904
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.075481415s of 14.293769836s, submitted: 87
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301498 data_alloc: 234881024 data_used: 21504000
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113147904 unmapped: 20086784 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113147904 unmapped: 20086784 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1300148 data_alloc: 234881024 data_used: 21504000
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113147904 unmapped: 20086784 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301820 data_alloc: 234881024 data_used: 21557248
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.004294395s of 14.015699387s, submitted: 3
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b00000 session 0x55922721ef00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226634800 session 0x5592267a63c0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 19996672 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 19996672 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa16e000/0x0/0x4ffc00000, data 0x112cdf8/0x11ec000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1076098 data_alloc: 234881024 data_used: 10645504
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x55922723c780
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1070046 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1070046 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1070046 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1070046 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1070046 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afa800 session 0x559226ffa960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676ec00 session 0x559226ffa780
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af6400 session 0x559226ffab40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x559226ffb680
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 30.319431305s of 30.379514694s, submitted: 28
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107036672 unmapped: 26198016 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107077632 unmapped: 26157056 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226634800 session 0x559226ffa5a0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676e000 session 0x55922721f4a0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b06c00 session 0x55922721e5a0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e4400 session 0x55922721e960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x559226ffb860
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1117062 data_alloc: 234881024 data_used: 10539008
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3ce000/0x0/0x4ffc00000, data 0x15ede08/0x16ae000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3ce000/0x0/0x4ffc00000, data 0x15ede08/0x16ae000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b01c00 session 0x5592247d9860
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x5592267850e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afc400 session 0x559226784780
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107028480 unmapped: 26206208 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1117924 data_alloc: 234881024 data_used: 10539008
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b06400 session 0x559226784960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3cd000/0x0/0x4ffc00000, data 0x15ede18/0x16af000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107036672 unmapped: 26198016 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107036672 unmapped: 26198016 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3cd000/0x0/0x4ffc00000, data 0x15ede18/0x16af000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3cd000/0x0/0x4ffc00000, data 0x15ede18/0x16af000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1152408 data_alloc: 234881024 data_used: 15626240
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3cd000/0x0/0x4ffc00000, data 0x15ede18/0x16af000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1152408 data_alloc: 234881024 data_used: 15626240
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.764934540s of 19.582212448s, submitted: 45
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3cd000/0x0/0x4ffc00000, data 0x15ede18/0x16af000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1152956 data_alloc: 234881024 data_used: 15638528
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency slow operation observed for kv_commit, latency = 5.380156040s
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency slow operation observed for kv_sync, latency = 5.380156517s
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency_fn slow operation observed for _txc_committed_kv, latency = 5.380500793s, txc = 0x559226356c00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114221056 unmapped: 19013632 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113123328 unmapped: 20111360 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f963c000/0x0/0x4ffc00000, data 0x237ee18/0x2440000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,11])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114171904 unmapped: 19062784 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114221056 unmapped: 19013632 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f963c000/0x0/0x4ffc00000, data 0x237ee18/0x2440000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253386 data_alloc: 234881024 data_used: 16334848
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f960a000/0x0/0x4ffc00000, data 0x23b0e18/0x2472000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f960a000/0x0/0x4ffc00000, data 0x23b0e18/0x2472000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1263652 data_alloc: 234881024 data_used: 16482304
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 6.865746021s of 15.085161209s, submitted: 113
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112640000 unmapped: 20594688 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112640000 unmapped: 20594688 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112640000 unmapped: 20594688 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9607000/0x0/0x4ffc00000, data 0x23b3e18/0x2475000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112648192 unmapped: 20586496 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261276 data_alloc: 234881024 data_used: 16486400
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 20578304 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 20578304 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 20578304 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 20578304 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x23b4e18/0x2476000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 20578304 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261500 data_alloc: 234881024 data_used: 16486400
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112664576 unmapped: 20570112 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afc000 session 0x559224ef1a40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af9800 session 0x559224ef03c0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b02c00 session 0x55922721f680
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f2400 session 0x55922721f4a0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x23b4e18/0x2476000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112664576 unmapped: 20570112 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.364326477s of 10.769536018s, submitted: 4
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9605000/0x0/0x4ffc00000, data 0x23b4e28/0x2477000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112672768 unmapped: 20561920 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112672768 unmapped: 20561920 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af9c00 session 0x559227226960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f2400 session 0x559226ffa1e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af9800 session 0x55922723d680
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112902144 unmapped: 23486464 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1329277 data_alloc: 234881024 data_used: 16490496
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afc000 session 0x5592265523c0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b02c00 session 0x55922669e1e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112574464 unmapped: 23814144 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922483b000 session 0x559226785c20
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112574464 unmapped: 23814144 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8e38000/0x0/0x4ffc00000, data 0x2b81e28/0x2c44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afdc00 session 0x5592247d8d20
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112574464 unmapped: 23814144 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x5592270fc5a0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676e800 session 0x559226fef680
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8e38000/0x0/0x4ffc00000, data 0x2b81e28/0x2c44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112599040 unmapped: 23789568 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113762304 unmapped: 22626304 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1352731 data_alloc: 234881024 data_used: 19812352
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119595008 unmapped: 16793600 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119595008 unmapped: 16793600 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119595008 unmapped: 16793600 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119595008 unmapped: 16793600 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8e37000/0x0/0x4ffc00000, data 0x2b81e38/0x2c45000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119627776 unmapped: 16760832 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1383891 data_alloc: 234881024 data_used: 24457216
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119627776 unmapped: 16760832 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8e37000/0x0/0x4ffc00000, data 0x2b81e38/0x2c45000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.307135582s of 14.417451859s, submitted: 28
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119652352 unmapped: 16736256 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119652352 unmapped: 16736256 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119652352 unmapped: 16736256 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119652352 unmapped: 16736256 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1384819 data_alloc: 234881024 data_used: 24469504
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8e36000/0x0/0x4ffc00000, data 0x2b81e38/0x2c45000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121552896 unmapped: 14835712 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121905152 unmapped: 14483456 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123551744 unmapped: 12836864 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123551744 unmapped: 12836864 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8723000/0x0/0x4ffc00000, data 0x3295e38/0x3359000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123592704 unmapped: 12795904 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1442267 data_alloc: 234881024 data_used: 24694784
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123592704 unmapped: 12795904 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123592704 unmapped: 12795904 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.862577438s of 11.139899254s, submitted: 73
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123625472 unmapped: 12763136 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8704000/0x0/0x4ffc00000, data 0x32b4e38/0x3378000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123658240 unmapped: 12730368 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123658240 unmapped: 12730368 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1440939 data_alloc: 234881024 data_used: 24694784
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8704000/0x0/0x4ffc00000, data 0x32b4e38/0x3378000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e3800 session 0x5592272270e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b03800 session 0x559226ffab40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 12722176 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8704000/0x0/0x4ffc00000, data 0x32b4e38/0x3378000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,4])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x5592255661e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1273528 data_alloc: 234881024 data_used: 16486400
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x559226f252c0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x5592247d94a0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x23b4e18/0x2476000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b01400 session 0x5592263fe000
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1097099 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.934545517s of 17.322147369s, submitted: 73
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1098611 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1098611 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2687778899' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1097728 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1097728 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1097728 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559224eeb000 session 0x559225566960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x55922657e960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x55922657f4a0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559224eeb000 session 0x55922546e960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 23.590827942s of 24.071311951s, submitted: 2
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1099528 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 23085056 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9e3a000/0x0/0x4ffc00000, data 0x1b82da6/0x1c42000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,6,11])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 22855680 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9e3a000/0x0/0x4ffc00000, data 0x1b82da6/0x1c42000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,17])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 22855680 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 29122560 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x55922546ef00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e1800 session 0x5592267a72c0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x5592267a6000
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x5592263ffe00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559224eeb000 session 0x5592263feb40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114098176 unmapped: 33841152 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1187056 data_alloc: 234881024 data_used: 10539008
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114098176 unmapped: 33841152 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x5592263fef00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114098176 unmapped: 33841152 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632000 session 0x559224742000
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114098176 unmapped: 33841152 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f99a9000/0x0/0x4ffc00000, data 0x1c03da6/0x1cc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x559226f25680
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 33513472 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x559226f25e00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 33513472 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9984000/0x0/0x4ffc00000, data 0x1c27dc9/0x1ce8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1233177 data_alloc: 234881024 data_used: 16625664
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114819072 unmapped: 33120256 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9984000/0x0/0x4ffc00000, data 0x1c27dc9/0x1ce8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268441 data_alloc: 234881024 data_used: 21921792
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9984000/0x0/0x4ffc00000, data 0x1c27dc9/0x1ce8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e1c00 session 0x55922546e5a0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268897 data_alloc: 234881024 data_used: 21934080
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9984000/0x0/0x4ffc00000, data 0x1c27dc9/0x1ce8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.883409500s of 21.952882767s, submitted: 22
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126459904 unmapped: 21479424 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c8d000/0x0/0x4ffc00000, data 0x2916dc9/0x29d7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,5])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 124387328 unmapped: 23552000 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c6a000/0x0/0x4ffc00000, data 0x2939dc9/0x29fa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 124616704 unmapped: 23322624 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1372079 data_alloc: 234881024 data_used: 22802432
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 7795 writes, 32K keys, 7795 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s#012Cumulative WAL: 7795 writes, 1759 syncs, 4.43 writes per sync, written: 0.03 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1988 writes, 7632 keys, 1988 commit groups, 1.0 writes per commit group, ingest: 8.26 MB, 0.01 MB/s#012Interval WAL: 1988 writes, 772 syncs, 2.58 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1372095 data_alloc: 234881024 data_used: 22802432
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 25436160 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 25436160 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 25436160 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 25436160 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1372095 data_alloc: 234881024 data_used: 22802432
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122511360 unmapped: 25427968 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122511360 unmapped: 25427968 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122511360 unmapped: 25427968 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122511360 unmapped: 25427968 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122519552 unmapped: 25419776 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1372247 data_alloc: 234881024 data_used: 22806528
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122519552 unmapped: 25419776 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122519552 unmapped: 25419776 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.538951874s of 21.155471802s, submitted: 103
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122519552 unmapped: 25419776 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122519552 unmapped: 25419776 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122527744 unmapped: 25411584 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1370448 data_alloc: 234881024 data_used: 22806528
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122527744 unmapped: 25411584 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122527744 unmapped: 25411584 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122527744 unmapped: 25411584 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122527744 unmapped: 25411584 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,4,0,6])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 141074432 unmapped: 15261696 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1400 session 0x559225017c20
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1500262 data_alloc: 234881024 data_used: 22806528
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122871808 unmapped: 33464320 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122871808 unmapped: 33464320 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f79ef000/0x0/0x4ffc00000, data 0x3bbcdc9/0x3c7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122871808 unmapped: 33464320 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08c00 session 0x559225473c20
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f79ef000/0x0/0x4ffc00000, data 0x3bbcdc9/0x3c7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122871808 unmapped: 33464320 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b02800 session 0x55922721ef00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.648444176s of 12.136721611s, submitted: 17
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b03000 session 0x559226552960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4400 session 0x55922546fc20
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122544128 unmapped: 33792000 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1502946 data_alloc: 234881024 data_used: 22810624
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122544128 unmapped: 33792000 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135053312 unmapped: 21282816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549f800 session 0x559225470b40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f79cb000/0x0/0x4ffc00000, data 0x3be0dc9/0x3ca1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139042816 unmapped: 17293312 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139042816 unmapped: 17293312 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139042816 unmapped: 17293312 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1625513 data_alloc: 251658240 data_used: 41050112
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139075584 unmapped: 17260544 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139149312 unmapped: 17186816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139288576 unmapped: 17047552 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f79cb000/0x0/0x4ffc00000, data 0x3be0dc9/0x3ca1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139337728 unmapped: 16998400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139337728 unmapped: 16998400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f79cb000/0x0/0x4ffc00000, data 0x3be0dc9/0x3ca1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1625426 data_alloc: 251658240 data_used: 41050112
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139337728 unmapped: 16998400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139460608 unmapped: 16875520 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.569581985s of 12.389707565s, submitted: 232
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142147584 unmapped: 14188544 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f7566000/0x0/0x4ffc00000, data 0x4044dc9/0x4105000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1678578 data_alloc: 251658240 data_used: 42459136
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b05400 session 0x5592250174a0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f7562000/0x0/0x4ffc00000, data 0x4048dc9/0x4109000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142516224 unmapped: 13819904 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1678578 data_alloc: 251658240 data_used: 42459136
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142516224 unmapped: 13819904 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f7562000/0x0/0x4ffc00000, data 0x4048dc9/0x4109000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142524416 unmapped: 13811712 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142524416 unmapped: 13811712 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142524416 unmapped: 13811712 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142532608 unmapped: 13803520 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1678578 data_alloc: 251658240 data_used: 42459136
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142532608 unmapped: 13803520 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f7562000/0x0/0x4ffc00000, data 0x4048dc9/0x4109000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142532608 unmapped: 13803520 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b09000 session 0x55922669e780
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248ef000 session 0x559225566960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.538358688s of 15.629971504s, submitted: 40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4400 session 0x5592255ff0e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1381072 data_alloc: 234881024 data_used: 22806528
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1382584 data_alloc: 234881024 data_used: 22806528
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1382584 data_alloc: 234881024 data_used: 22806528
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afd000 session 0x559226f24960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e8400 session 0x55922669e3c0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.595676422s of 14.635678291s, submitted: 13
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226636c00 session 0x559226553e00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b02400 session 0x55922723d860
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ecc00 session 0x5592267a63c0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f3400 session 0x55922721e5a0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ec800 session 0x55922721fa40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226728800 session 0x559225016780
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 30.576278687s of 31.081003189s, submitted: 24
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261eb000 session 0x559226ffb860
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x5592263fe1e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1800 session 0x559226f25680
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261edc00 session 0x559226f250e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4400 session 0x559226f25860
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 37027840 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 37027840 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1800 session 0x559225471860
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194678 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119324672 unmapped: 37011456 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261eb000 session 0x559225472b40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261edc00 session 0x559226f252c0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afa800 session 0x559226f241e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119480320 unmapped: 36855808 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9c40000/0x0/0x4ffc00000, data 0x196cdf8/0x1a2c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119496704 unmapped: 36839424 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119496704 unmapped: 36839424 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9c1b000/0x0/0x4ffc00000, data 0x1990e08/0x1a51000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119955456 unmapped: 36380672 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260962 data_alloc: 234881024 data_used: 19304448
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119955456 unmapped: 36380672 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119955456 unmapped: 36380672 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119808000 unmapped: 36528128 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.900426865s of 10.097883224s, submitted: 55
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e1000 session 0x559227226b40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x559224742b40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1800 session 0x55922669e780
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9c1b000/0x0/0x4ffc00000, data 0x1990e08/0x1a51000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1134230 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1134230 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1134230 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676e800 session 0x559226785a40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x5592250174a0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559224eea400 session 0x559225017680
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x559225016000
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.801321030s of 13.943515778s, submitted: 49
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,15])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1800 session 0x559225017c20
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676e800 session 0x5592254705a0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x559225473e00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1c00 session 0x55922721eb40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x55922721f0e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 47742976 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 47742976 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 47742976 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f937a000/0x0/0x4ffc00000, data 0x2232da6/0x22f2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1264351 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 47742976 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e4800 session 0x559224ef01e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 47742976 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f937a000/0x0/0x4ffc00000, data 0x2232da6/0x22f2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b02c00 session 0x5592255661e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117309440 unmapped: 47423488 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117309440 unmapped: 47423488 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9355000/0x0/0x4ffc00000, data 0x2256dc9/0x2317000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117309440 unmapped: 47423488 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9355000/0x0/0x4ffc00000, data 0x2256dc9/0x2317000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1303380 data_alloc: 234881024 data_used: 15511552
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 43442176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125919232 unmapped: 38813696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9355000/0x0/0x4ffc00000, data 0x2256dc9/0x2317000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125952000 unmapped: 38780928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.294668198s of 11.932563782s, submitted: 37
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549f000 session 0x55922657fe00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676e400 session 0x5592272270e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125952000 unmapped: 38780928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118267904 unmapped: 46465024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x559226552b40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1146182 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 46399488 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: mgrc ms_handle_reset ms_handle_reset con 0x5592249d8000
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/4198923246
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/4198923246,v1:192.168.122.100:6801/4198923246]
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: mgrc handle_mgr_configure stats_period=5
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x559225471a40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1146182 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1146182 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1146182 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1146182 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.419353485s of 23.247339249s, submitted: 36
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147694 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147694 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147694 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226f23000 session 0x559226784f00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559227766000 session 0x559226785c20
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b00800 session 0x5592254725a0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x5592254730e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.210437775s of 15.619210243s, submitted: 1
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x5592263ff2c0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118603776 unmapped: 46129152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118603776 unmapped: 46129152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118603776 unmapped: 46129152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220376 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afb400 session 0x5592270fd860
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118603776 unmapped: 46129152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118603776 unmapped: 46129152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x5592270fc960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d7400 session 0x559225567c20
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118620160 unmapped: 46112768 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118620160 unmapped: 46112768 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x559226ffbc20
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118652928 unmapped: 46080000 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1221149 data_alloc: 234881024 data_used: 10539008
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118652928 unmapped: 46080000 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118652928 unmapped: 46080000 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279061 data_alloc: 234881024 data_used: 19120128
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279061 data_alloc: 234881024 data_used: 19120128
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.080177307s of 20.722246170s, submitted: 23
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 43720704 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9265000/0x0/0x4ffc00000, data 0x2340d96/0x23ff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126017536 unmapped: 38715392 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126550016 unmapped: 38182912 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1390845 data_alloc: 234881024 data_used: 19333120
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126550016 unmapped: 38182912 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8ec0000/0x0/0x4ffc00000, data 0x26e4d96/0x27a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126566400 unmapped: 38166528 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 39575552 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8ec9000/0x0/0x4ffc00000, data 0x26e4d96/0x27a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 39542784 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 39542784 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8ec9000/0x0/0x4ffc00000, data 0x26e4d96/0x27a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1382581 data_alloc: 234881024 data_used: 19349504
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 39542784 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 39542784 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125485056 unmapped: 39247872 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125485056 unmapped: 39247872 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125485056 unmapped: 39247872 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1384645 data_alloc: 234881024 data_used: 19349504
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8ea1000/0x0/0x4ffc00000, data 0x270cd96/0x27cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125485056 unmapped: 39247872 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125485056 unmapped: 39247872 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x5592270fc000
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.293542862s of 14.293901443s, submitted: 116
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x559226f250e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08800 session 0x55922677f2c0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162987 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162987 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162987 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162987 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d5800 session 0x5592272292c0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x559226fee3c0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x5592254efe00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x559226785c20
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.185562134s of 22.248020172s, submitted: 32
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08800 session 0x5592254725a0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e6000 session 0x55922723c960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x55922657e1e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x5592270fde00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08800 session 0x559225473680
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 43679744 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209420 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 43679744 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 43679744 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 43679744 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226e3c800 session 0x559225017a40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa07c000/0x0/0x4ffc00000, data 0x1530da6/0x15f0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225656800 session 0x55922669f2c0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 43679744 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x559226f24780
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x559226ffa3c0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 45318144 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216924 data_alloc: 234881024 data_used: 10543104
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 45318144 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa056000/0x0/0x4ffc00000, data 0x1554dd9/0x1616000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1246108 data_alloc: 234881024 data_used: 14667776
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa056000/0x0/0x4ffc00000, data 0x1554dd9/0x1616000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1246108 data_alloc: 234881024 data_used: 14667776
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.542800903s of 17.671800613s, submitted: 41
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122945536 unmapped: 41787392 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 124780544 unmapped: 39952384 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9869000/0x0/0x4ffc00000, data 0x1d41dd9/0x1e03000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125403136 unmapped: 39329792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f97dc000/0x0/0x4ffc00000, data 0x1dc8dd9/0x1e8a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226864c00 session 0x55922677f860
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226863400 session 0x55922721f0e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922723b000 session 0x559226aeb680
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922723b000 session 0x559225016000
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 124108800 unmapped: 40624128 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x55922669e1e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x55922657f4a0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226863400 session 0x559224649680
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226864c00 session 0x559224ef0f00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225656c00 session 0x55922723cf00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1386172 data_alloc: 234881024 data_used: 16515072
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 39067648 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 39067648 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9129000/0x0/0x4ffc00000, data 0x2477e4b/0x253b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 39034880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 39034880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 39034880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226634400 session 0x55922723da40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1379724 data_alloc: 234881024 data_used: 16515072
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549c400 session 0x5592267852c0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 39034880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4400 session 0x559225625680
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ecc00 session 0x5592270fd860
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 39034880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.054047585s of 10.986348152s, submitted: 167
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f910f000/0x0/0x4ffc00000, data 0x2498e6e/0x255d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126746624 unmapped: 37986304 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126754816 unmapped: 37978112 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f910f000/0x0/0x4ffc00000, data 0x2498e6e/0x255d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128802816 unmapped: 35930112 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1419509 data_alloc: 234881024 data_used: 22065152
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128811008 unmapped: 35921920 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128811008 unmapped: 35921920 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128811008 unmapped: 35921920 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f910f000/0x0/0x4ffc00000, data 0x2498e6e/0x255d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128843776 unmapped: 35889152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128843776 unmapped: 35889152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1419509 data_alloc: 234881024 data_used: 22065152
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128843776 unmapped: 35889152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128843776 unmapped: 35889152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128843776 unmapped: 35889152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f910f000/0x0/0x4ffc00000, data 0x2498e6e/0x255d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.671233177s of 10.673833847s, submitted: 1
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128876544 unmapped: 35856384 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f910c000/0x0/0x4ffc00000, data 0x249be6e/0x2560000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 133996544 unmapped: 30736384 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1494901 data_alloc: 234881024 data_used: 23621632
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 133668864 unmapped: 31064064 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134979584 unmapped: 29753344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134979584 unmapped: 29753344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134979584 unmapped: 29753344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134979584 unmapped: 29753344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88d4000/0x0/0x4ffc00000, data 0x2cd3e6e/0x2d98000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1505309 data_alloc: 234881024 data_used: 24363008
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134987776 unmapped: 29745152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134987776 unmapped: 29745152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134987776 unmapped: 29745152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88d4000/0x0/0x4ffc00000, data 0x2cd3e6e/0x2d98000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134987776 unmapped: 29745152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88d4000/0x0/0x4ffc00000, data 0x2cd3e6e/0x2d98000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134987776 unmapped: 29745152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.098537445s of 12.288821220s, submitted: 94
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1502669 data_alloc: 234881024 data_used: 24371200
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135053312 unmapped: 29679616 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88d2000/0x0/0x4ffc00000, data 0x2cd4e6e/0x2d99000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88d2000/0x0/0x4ffc00000, data 0x2cd4e6e/0x2d99000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1503037 data_alloc: 234881024 data_used: 24436736
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88cd000/0x0/0x4ffc00000, data 0x2cdae6e/0x2d9f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ecc00 session 0x55922669f2c0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135069696 unmapped: 29663232 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135094272 unmapped: 29638656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1348975 data_alloc: 234881024 data_used: 16515072
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 7.806308746s of 10.239793777s, submitted: 37
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88ce000/0x0/0x4ffc00000, data 0x2cdae5e/0x2d9e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,2])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dfc/0x2009000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dfc/0x2009000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226f23400 session 0x559226784f00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dd9/0x2008000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1347011 data_alloc: 234881024 data_used: 16498688
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dd9/0x2008000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1347011 data_alloc: 234881024 data_used: 16498688
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dd9/0x2008000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132038656 unmapped: 32694272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132038656 unmapped: 32694272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132038656 unmapped: 32694272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.272031784s of 13.655331612s, submitted: 34
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08800 session 0x55922546f0e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132055040 unmapped: 32677888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dd9/0x2008000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1346027 data_alloc: 234881024 data_used: 16498688
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226e3c800 session 0x5592254ef0e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132055040 unmapped: 32677888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132055040 unmapped: 32677888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127320064 unmapped: 37412864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1199806 data_alloc: 234881024 data_used: 10649600
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa48d000/0x0/0x4ffc00000, data 0x111fdb9/0x11df000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226865000 session 0x559223b86960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127352832 unmapped: 37380096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127369216 unmapped: 37363712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127369216 unmapped: 37363712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127369216 unmapped: 37363712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127369216 unmapped: 37363712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127369216 unmapped: 37363712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ef000 session 0x559226aea3c0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225657800 session 0x5592263feb40
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225657400 session 0x559226fef0e0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226e3ac00 session 0x55922669f680
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 36.472564697s of 42.319786072s, submitted: 56
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127352832 unmapped: 37380096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127352832 unmapped: 37380096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127377408 unmapped: 37355520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127959040 unmapped: 36773888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676f400 session 0x5592247d85a0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225657400 session 0x5592263fe780
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225657800 session 0x55922721fe00
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156edcf/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1234189 data_alloc: 234881024 data_used: 10539008
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127959040 unmapped: 36773888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127959040 unmapped: 36773888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ef000 session 0x559224648000
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226e3ac00 session 0x559226fee960
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127959040 unmapped: 36773888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127959040 unmapped: 36773888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127967232 unmapped: 36765696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1234133 data_alloc: 234881024 data_used: 10539008
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127967232 unmapped: 36765696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127967232 unmapped: 36765696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 2.432877302s of 11.771712303s, submitted: 33
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127967232 unmapped: 36765696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b0a000 session 0x559225473c20
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127991808 unmapped: 36741120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127991808 unmapped: 36741120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1235130 data_alloc: 234881024 data_used: 10539008
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127991808 unmapped: 36741120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127967232 unmapped: 36765696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127385600 unmapped: 37347328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127385600 unmapped: 37347328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127385600 unmapped: 37347328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1265986 data_alloc: 234881024 data_used: 15028224
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127385600 unmapped: 37347328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127385600 unmapped: 37347328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1265986 data_alloc: 234881024 data_used: 15028224
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.148657799s of 15.006252289s, submitted: 6
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132644864 unmapped: 32088064 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128950272 unmapped: 35782656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1302734 data_alloc: 234881024 data_used: 15024128
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b43000/0x0/0x4ffc00000, data 0x1a68e08/0x1b29000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,2,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128950272 unmapped: 35782656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130285568 unmapped: 34447360 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129679360 unmapped: 35053568 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f997a000/0x0/0x4ffc00000, data 0x1c29e08/0x1cea000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,4])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129687552 unmapped: 35045376 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1328778 data_alloc: 234881024 data_used: 15020032
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f96ff000/0x0/0x4ffc00000, data 0x1eace08/0x1f6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130875392 unmapped: 33857536 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130883584 unmapped: 33849344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130883584 unmapped: 33849344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 1.013013244s of 10.206089973s, submitted: 61
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f96ff000/0x0/0x4ffc00000, data 0x1eace08/0x1f6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130883584 unmapped: 33849344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130883584 unmapped: 33849344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1333106 data_alloc: 234881024 data_used: 15360000
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130883584 unmapped: 33849344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130916352 unmapped: 33816576 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130916352 unmapped: 33816576 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130916352 unmapped: 33816576 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f96e2000/0x0/0x4ffc00000, data 0x1ec9e08/0x1f8a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130916352 unmapped: 33816576 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1348124 data_alloc: 234881024 data_used: 15777792
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130916352 unmapped: 33816576 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130932736 unmapped: 33800192 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130932736 unmapped: 33800192 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130932736 unmapped: 33800192 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.314796448s of 11.320782661s, submitted: 31
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261f1400 session 0x55922669fc20
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226f22000 session 0x559226fef4a0
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129081344 unmapped: 35651584 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f96e2000/0x0/0x4ffc00000, data 0x1ec9e08/0x1f8a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1207674 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129089536 unmapped: 35643392 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afa000 session 0x559226aeb680
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129146880 unmapped: 35586048 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129146880 unmapped: 35586048 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129146880 unmapped: 35586048 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129146880 unmapped: 35586048 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129146880 unmapped: 35586048 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129163264 unmapped: 35569664 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129163264 unmapped: 35569664 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129163264 unmapped: 35569664 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129179648 unmapped: 35553280 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129179648 unmapped: 35553280 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129228800 unmapped: 35504128 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: do_command 'config diff' '{prefix=config diff}'
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: do_command 'config show' '{prefix=config show}'
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: do_command 'counter dump' '{prefix=counter dump}'
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: do_command 'counter schema' '{prefix=counter schema}'
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128770048 unmapped: 35962880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128917504 unmapped: 35815424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:30:16 np0005593295 ceph-osd[81231]: do_command 'log dump' '{prefix=log dump}'
Jan 23 05:30:16 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:30:16 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4224081289' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:30:16 np0005593295 nova_compute[225701]: 2026-01-23 10:30:16.654 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.775s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:30:16 np0005593295 nova_compute[225701]: 2026-01-23 10:30:16.660 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:30:16 np0005593295 nova_compute[225701]: 2026-01-23 10:30:16.686 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:30:16 np0005593295 nova_compute[225701]: 2026-01-23 10:30:16.688 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 05:30:16 np0005593295 nova_compute[225701]: 2026-01-23 10:30:16.688 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.961s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:30:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:17 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Jan 23 05:30:17 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/737140567' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 05:30:17 np0005593295 nova_compute[225701]: 2026-01-23 10:30:17.111 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:17.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:30:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:17.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:30:17 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 23 05:30:17 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/232885380' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 05:30:17 np0005593295 nova_compute[225701]: 2026-01-23 10:30:17.689 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:30:17 np0005593295 nova_compute[225701]: 2026-01-23 10:30:17.690 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:30:17 np0005593295 nova_compute[225701]: 2026-01-23 10:30:17.690 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:30:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:18 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 23 05:30:18 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/641666540' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 05:30:18 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:18 np0005593295 nova_compute[225701]: 2026-01-23 10:30:18.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:30:18 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 23 05:30:18 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2869222915' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 05:30:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:19 np0005593295 nova_compute[225701]: 2026-01-23 10:30:19.344 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:19.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:19.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:19 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 23 05:30:19 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3233910781' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 05:30:19 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Jan 23 05:30:19 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3682934773' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 05:30:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:20 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 23 05:30:20 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/341034865' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 05:30:20 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Jan 23 05:30:20 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1621252379' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 05:30:20 np0005593295 nova_compute[225701]: 2026-01-23 10:30:20.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:30:20 np0005593295 nova_compute[225701]: 2026-01-23 10:30:20.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 05:30:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:21 np0005593295 systemd[1]: Starting Hostname Service...
Jan 23 05:30:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:21 np0005593295 systemd[1]: Started Hostname Service.
Jan 23 05:30:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:21.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:30:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:21.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:30:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:22 np0005593295 nova_compute[225701]: 2026-01-23 10:30:22.114 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:23.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:23.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:23 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:23 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Jan 23 05:30:23 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2294747835' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 05:30:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:24 np0005593295 nova_compute[225701]: 2026-01-23 10:30:24.346 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:24 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Jan 23 05:30:24 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2182990666' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 05:30:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:25.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:25.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:26 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Jan 23 05:30:26 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4080595786' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 05:30:26 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 05:30:26 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 05:30:26 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 05:30:26 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 05:30:26 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 05:30:26 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 05:30:26 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 05:30:26 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 05:30:26 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Jan 23 05:30:26 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2610036103' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 05:30:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:27 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Jan 23 05:30:27 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3569729491' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 05:30:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:27 np0005593295 nova_compute[225701]: 2026-01-23 10:30:27.200 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:27 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Jan 23 05:30:27 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/896818155' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 05:30:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:27.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:27 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Jan 23 05:30:27 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2736972453' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 23 05:30:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:27.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:27 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Jan 23 05:30:27 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2037715614' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 05:30:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:28 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Jan 23 05:30:28 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/26149318' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 05:30:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:28 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Jan 23 05:30:28 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1004606240' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 05:30:28 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Jan 23 05:30:28 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4061140840' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 05:30:28 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Jan 23 05:30:28 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/631308360' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 05:30:28 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:28 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Jan 23 05:30:28 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1961868252' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 23 05:30:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:29 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Jan 23 05:30:29 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3026965908' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 23 05:30:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:29 np0005593295 nova_compute[225701]: 2026-01-23 10:30:29.347 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:29 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Jan 23 05:30:29 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1997317476' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 05:30:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:29.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:30:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:29.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:30:29 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Jan 23 05:30:29 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/629032271' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 05:30:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:31.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:30:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:31.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:30:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:32 np0005593295 nova_compute[225701]: 2026-01-23 10:30:32.202 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:33 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Jan 23 05:30:33 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3880234350' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 23 05:30:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:30:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:33.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:30:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:33.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:33 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:33 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Jan 23 05:30:33 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3916140662' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 05:30:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:34 np0005593295 nova_compute[225701]: 2026-01-23 10:30:34.348 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:34 np0005593295 podman[241963]: 2026-01-23 10:30:34.658548426 +0000 UTC m=+0.060040057 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 05:30:34 np0005593295 podman[241962]: 2026-01-23 10:30:34.698450683 +0000 UTC m=+0.099536205 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:30:34 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Jan 23 05:30:34 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3359416428' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 23 05:30:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:35.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:35.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:35 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 05:30:35 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 05:30:35 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 05:30:35 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 05:30:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:36 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Jan 23 05:30:36 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/25636672' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 23 05:30:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:37 np0005593295 nova_compute[225701]: 2026-01-23 10:30:37.204 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:37.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:37 np0005593295 ovs-appctl[242834]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 23 05:30:37 np0005593295 ovs-appctl[242839]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 23 05:30:37 np0005593295 ovs-appctl[242843]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 23 05:30:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:37.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:37 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Jan 23 05:30:37 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1362458980' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 23 05:30:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:38 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Jan 23 05:30:38 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/419775306' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 23 05:30:38 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Jan 23 05:30:38 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/372859611' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 23 05:30:38 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:39 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Jan 23 05:30:39 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1920671338' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 23 05:30:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:39 np0005593295 nova_compute[225701]: 2026-01-23 10:30:39.349 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:39.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:39.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:40 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0)
Jan 23 05:30:40 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/50610532' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 23 05:30:40 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:30:40 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.0 total, 600.0 interval#012Cumulative writes: 6595 writes, 34K keys, 6595 commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.04 MB/s#012Cumulative WAL: 6594 writes, 6594 syncs, 1.00 writes per sync, written: 0.09 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1522 writes, 7535 keys, 1522 commit groups, 1.0 writes per commit group, ingest: 17.69 MB, 0.03 MB/s#012Interval WAL: 1521 writes, 1521 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     22.9      2.13              0.20        18    0.118       0      0       0.0       0.0#012  L6      1/0   13.93 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   4.4     84.8     72.8      2.92              0.88        17    0.172     94K   9319       0.0       0.0#012 Sum      1/0   13.93 MB   0.0      0.2     0.0      0.2       0.3      0.1       0.0   5.4     49.0     51.8      5.06              1.08        35    0.144     94K   9319       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   5.6     24.9     25.6      2.49              0.20         8    0.311     26K   2540       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   0.0     84.8     72.8      2.92              0.88        17    0.172     94K   9319       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     23.0      2.13              0.20        17    0.125       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 2400.0 total, 600.0 interval#012Flush(GB): cumulative 0.048, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.26 GB write, 0.11 MB/s write, 0.24 GB read, 0.10 MB/s read, 5.1 seconds#012Interval compaction: 0.06 GB write, 0.11 MB/s write, 0.06 GB read, 0.10 MB/s read, 2.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55c6513709b0#2 capacity: 304.00 MB usage: 22.75 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000186 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1398,22.02 MB,7.24371%) FilterBlock(35,274.42 KB,0.0881546%) IndexBlock(35,473.86 KB,0.152221%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 05:30:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:40 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump"} v 0)
Jan 23 05:30:40 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1965759482' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Jan 23 05:30:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:41.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:41 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Jan 23 05:30:41 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/22422002' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Jan 23 05:30:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:41.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:42 np0005593295 nova_compute[225701]: 2026-01-23 10:30:42.206 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:42 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd dump"} v 0)
Jan 23 05:30:42 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1755609551' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Jan 23 05:30:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:43 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status"} v 0)
Jan 23 05:30:43 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3090469002' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Jan 23 05:30:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:43.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:30:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:43.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:30:43 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:44 np0005593295 nova_compute[225701]: 2026-01-23 10:30:44.392 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:44 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0)
Jan 23 05:30:44 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4118000991' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Jan 23 05:30:44 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat"} v 0)
Jan 23 05:30:44 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3681758551' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Jan 23 05:30:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:45.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:45.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:46 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Jan 23 05:30:46 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1942062714' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 05:30:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:47 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:30:47 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:30:47 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Jan 23 05:30:47 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3530506813' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Jan 23 05:30:47 np0005593295 nova_compute[225701]: 2026-01-23 10:30:47.208 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:30:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:47.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:30:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:30:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:47.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:30:47 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:30:47 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:30:47 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:30:47 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:30:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:48 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0)
Jan 23 05:30:48 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2304269384' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 05:30:48 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:48 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0)
Jan 23 05:30:48 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3049937054' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Jan 23 05:30:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:49 np0005593295 nova_compute[225701]: 2026-01-23 10:30:49.392 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:30:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:49.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:30:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:49.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:49 np0005593295 virtqemud[225221]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 23 05:30:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:50 np0005593295 systemd[1]: Starting Time & Date Service...
Jan 23 05:30:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:50 np0005593295 systemd[1]: Started Time & Date Service.
Jan 23 05:30:50 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0)
Jan 23 05:30:50 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3027541159' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Jan 23 05:30:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:51 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0)
Jan 23 05:30:51 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2631473347' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Jan 23 05:30:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:51.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:51 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0)
Jan 23 05:30:51 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3298491043' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Jan 23 05:30:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:30:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:51.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:30:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:52 np0005593295 nova_compute[225701]: 2026-01-23 10:30:52.211 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:52 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0)
Jan 23 05:30:52 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/814440401' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Jan 23 05:30:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:30:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:53.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:30:53 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0)
Jan 23 05:30:53 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1998309136' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Jan 23 05:30:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:30:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:53.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:30:53 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:54 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0)
Jan 23 05:30:54 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2533419223' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Jan 23 05:30:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:54 np0005593295 nova_compute[225701]: 2026-01-23 10:30:54.395 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:54 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:30:54 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:30:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:30:55.499 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:30:55.500 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:30:55.501 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:55.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:55 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0)
Jan 23 05:30:55 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1831285676' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 05:30:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:30:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:55.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:30:55 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0)
Jan 23 05:30:55 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2165199586' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Jan 23 05:30:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:57 np0005593295 nova_compute[225701]: 2026-01-23 10:30:57.214 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:30:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:57.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:30:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:30:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:57.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:30:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:58 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Jan 23 05:30:58 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4212707187' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Jan 23 05:30:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:58 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0)
Jan 23 05:30:58 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4094141977' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Jan 23 05:30:58 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:30:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:30:59 np0005593295 nova_compute[225701]: 2026-01-23 10:30:59.435 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:59.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:30:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:59.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:30:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:01.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:01.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:02 np0005593295 nova_compute[225701]: 2026-01-23 10:31:02.216 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:03.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:31:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:03.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:31:03 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:03 np0005593295 nova_compute[225701]: 2026-01-23 10:31:03.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:03 np0005593295 nova_compute[225701]: 2026-01-23 10:31:03.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:31:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:04 np0005593295 nova_compute[225701]: 2026-01-23 10:31:04.482 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:05.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:05 np0005593295 podman[245946]: 2026-01-23 10:31:05.636614886 +0000 UTC m=+0.055599738 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:31:05 np0005593295 podman[245945]: 2026-01-23 10:31:05.668050853 +0000 UTC m=+0.087347182 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 05:31:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:05.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:06 np0005593295 nova_compute[225701]: 2026-01-23 10:31:06.800 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:07 np0005593295 nova_compute[225701]: 2026-01-23 10:31:07.218 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:07.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:07.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:08 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:09 np0005593295 nova_compute[225701]: 2026-01-23 10:31:09.521 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:31:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:09.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:31:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:09.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:10 np0005593295 nova_compute[225701]: 2026-01-23 10:31:10.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:10 np0005593295 nova_compute[225701]: 2026-01-23 10:31:10.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:31:10 np0005593295 nova_compute[225701]: 2026-01-23 10:31:10.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:31:10 np0005593295 nova_compute[225701]: 2026-01-23 10:31:10.800 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:31:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:11.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:11.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:12 np0005593295 nova_compute[225701]: 2026-01-23 10:31:12.220 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:31:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:13.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:31:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:13.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:13 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:14 np0005593295 nova_compute[225701]: 2026-01-23 10:31:14.524 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:14 np0005593295 nova_compute[225701]: 2026-01-23 10:31:14.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:15.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:15.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:15 np0005593295 nova_compute[225701]: 2026-01-23 10:31:15.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:16 np0005593295 nova_compute[225701]: 2026-01-23 10:31:16.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:16 np0005593295 nova_compute[225701]: 2026-01-23 10:31:16.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:16 np0005593295 nova_compute[225701]: 2026-01-23 10:31:16.825 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:16 np0005593295 nova_compute[225701]: 2026-01-23 10:31:16.825 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:16 np0005593295 nova_compute[225701]: 2026-01-23 10:31:16.825 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:16 np0005593295 nova_compute[225701]: 2026-01-23 10:31:16.826 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:31:16 np0005593295 nova_compute[225701]: 2026-01-23 10:31:16.826 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:31:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:17 np0005593295 nova_compute[225701]: 2026-01-23 10:31:17.222 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:17 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:31:17 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/177182351' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:31:17 np0005593295 nova_compute[225701]: 2026-01-23 10:31:17.329 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:31:17 np0005593295 nova_compute[225701]: 2026-01-23 10:31:17.463 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:31:17 np0005593295 nova_compute[225701]: 2026-01-23 10:31:17.464 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4657MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:31:17 np0005593295 nova_compute[225701]: 2026-01-23 10:31:17.465 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:17 np0005593295 nova_compute[225701]: 2026-01-23 10:31:17.465 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:31:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:17.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:31:17 np0005593295 nova_compute[225701]: 2026-01-23 10:31:17.647 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:31:17 np0005593295 nova_compute[225701]: 2026-01-23 10:31:17.647 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:31:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:31:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:17.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:31:17 np0005593295 nova_compute[225701]: 2026-01-23 10:31:17.717 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:31:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:18 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:31:18 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/339495991' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:31:18 np0005593295 nova_compute[225701]: 2026-01-23 10:31:18.295 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.578s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:31:18 np0005593295 nova_compute[225701]: 2026-01-23 10:31:18.301 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:31:18 np0005593295 nova_compute[225701]: 2026-01-23 10:31:18.318 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:31:18 np0005593295 nova_compute[225701]: 2026-01-23 10:31:18.320 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:31:18 np0005593295 nova_compute[225701]: 2026-01-23 10:31:18.320 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.855s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:18 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:19 np0005593295 nova_compute[225701]: 2026-01-23 10:31:19.320 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:19 np0005593295 nova_compute[225701]: 2026-01-23 10:31:19.338 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:19 np0005593295 nova_compute[225701]: 2026-01-23 10:31:19.528 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:19.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:19.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:19 np0005593295 nova_compute[225701]: 2026-01-23 10:31:19.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:20 np0005593295 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 05:31:20 np0005593295 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 05:31:20 np0005593295 nova_compute[225701]: 2026-01-23 10:31:20.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:20 np0005593295 nova_compute[225701]: 2026-01-23 10:31:20.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:31:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:21.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:31:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:21.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:31:21 np0005593295 nova_compute[225701]: 2026-01-23 10:31:21.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:22 np0005593295 nova_compute[225701]: 2026-01-23 10:31:22.283 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:23.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:23.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:23 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:24 np0005593295 nova_compute[225701]: 2026-01-23 10:31:24.530 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:25.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:25.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:27 np0005593295 nova_compute[225701]: 2026-01-23 10:31:27.185 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:27 np0005593295 nova_compute[225701]: 2026-01-23 10:31:27.185 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:31:27 np0005593295 nova_compute[225701]: 2026-01-23 10:31:27.200 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:31:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:27 np0005593295 nova_compute[225701]: 2026-01-23 10:31:27.289 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:31:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:27.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:27.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:28 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:29.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:29 np0005593295 nova_compute[225701]: 2026-01-23 10:31:29.584 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:31:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:31:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:29.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:31:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:31.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:31.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:32 np0005593295 nova_compute[225701]: 2026-01-23 10:31:32.293 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:31:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:33.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:31:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:33.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:31:33 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:34 np0005593295 nova_compute[225701]: 2026-01-23 10:31:34.589 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:31:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:31:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:35.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:31:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:35.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:36 np0005593295 podman[246118]: 2026-01-23 10:31:36.449819294 +0000 UTC m=+0.058704604 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 23 05:31:36 np0005593295 podman[246117]: 2026-01-23 10:31:36.488714326 +0000 UTC m=+0.099579715 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:31:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:37 np0005593295 nova_compute[225701]: 2026-01-23 10:31:37.296 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:31:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:37.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:31:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:37.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:31:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:38 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:39.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:39 np0005593295 nova_compute[225701]: 2026-01-23 10:31:39.593 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:31:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 05:31:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:39.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 05:31:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:41.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:41.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:42 np0005593295 nova_compute[225701]: 2026-01-23 10:31:42.299 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:31:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:43.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:31:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:43.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:31:43 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:44 np0005593295 nova_compute[225701]: 2026-01-23 10:31:44.594 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:31:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:31:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:45.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:31:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:45.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:47 np0005593295 nova_compute[225701]: 2026-01-23 10:31:47.300 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:31:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:31:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:47.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:31:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:47.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:48 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:49 np0005593295 nova_compute[225701]: 2026-01-23 10:31:49.596 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:31:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:49.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:49.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:50 np0005593295 systemd-logind[786]: Session 55 logged out. Waiting for processes to exit.
Jan 23 05:31:50 np0005593295 systemd[1]: session-55.scope: Deactivated successfully.
Jan 23 05:31:50 np0005593295 systemd[1]: session-55.scope: Consumed 2min 57.559s CPU time, 747.7M memory peak, read 312.0M from disk, written 64.1M to disk.
Jan 23 05:31:50 np0005593295 systemd-logind[786]: Removed session 55.
Jan 23 05:31:50 np0005593295 systemd-logind[786]: New session 56 of user zuul.
Jan 23 05:31:50 np0005593295 systemd[1]: Started Session 56 of User zuul.
Jan 23 05:31:50 np0005593295 nova_compute[225701]: 2026-01-23 10:31:50.678 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:31:50 np0005593295 systemd[1]: session-56.scope: Deactivated successfully.
Jan 23 05:31:50 np0005593295 systemd-logind[786]: Session 56 logged out. Waiting for processes to exit.
Jan 23 05:31:50 np0005593295 systemd-logind[786]: Removed session 56.
Jan 23 05:31:50 np0005593295 systemd-logind[786]: New session 57 of user zuul.
Jan 23 05:31:50 np0005593295 systemd[1]: Started Session 57 of User zuul.
Jan 23 05:31:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:51 np0005593295 systemd[1]: session-57.scope: Deactivated successfully.
Jan 23 05:31:51 np0005593295 systemd-logind[786]: Session 57 logged out. Waiting for processes to exit.
Jan 23 05:31:51 np0005593295 systemd-logind[786]: Removed session 57.
Jan 23 05:31:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:51.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:51.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:52 np0005593295 nova_compute[225701]: 2026-01-23 10:31:52.302 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:31:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:53.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:53.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:53 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:54 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:31:54 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:31:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:54 np0005593295 nova_compute[225701]: 2026-01-23 10:31:54.640 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:31:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:55 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:31:55 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:31:55 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:31:55 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:31:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:31:55.501 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:31:55.502 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:31:55.502 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:55.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:31:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:55.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:31:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:57 np0005593295 nova_compute[225701]: 2026-01-23 10:31:57.305 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:57.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:57.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:58 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:31:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:31:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:31:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:31:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:59.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:31:59 np0005593295 nova_compute[225701]: 2026-01-23 10:31:59.640 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:31:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:59.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:00 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:32:00 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:32:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:01.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:01.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:02 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:32:02 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.1 total, 600.0 interval#012Cumulative writes: 9973 writes, 39K keys, 9973 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s#012Cumulative WAL: 9973 writes, 2660 syncs, 3.75 writes per sync, written: 0.03 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2178 writes, 7712 keys, 2178 commit groups, 1.0 writes per commit group, ingest: 7.91 MB, 0.01 MB/s#012Interval WAL: 2178 writes, 901 syncs, 2.42 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:32:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:02 np0005593295 nova_compute[225701]: 2026-01-23 10:32:02.308 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.341154) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322341366, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 2850, "num_deletes": 506, "total_data_size": 6397733, "memory_usage": 6490536, "flush_reason": "Manual Compaction"}
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322366528, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 2686587, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33630, "largest_seqno": 36474, "table_properties": {"data_size": 2676761, "index_size": 5040, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3717, "raw_key_size": 31915, "raw_average_key_size": 21, "raw_value_size": 2652031, "raw_average_value_size": 1807, "num_data_blocks": 215, "num_entries": 1467, "num_filter_entries": 1467, "num_deletions": 506, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164133, "oldest_key_time": 1769164133, "file_creation_time": 1769164322, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 25403 microseconds, and 8139 cpu microseconds.
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.366625) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 2686587 bytes OK
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.366653) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.369094) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.369113) EVENT_LOG_v1 {"time_micros": 1769164322369109, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.369129) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 6383313, prev total WAL file size 6383313, number of live WAL files 2.
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.370806) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303034' seq:72057594037927935, type:22 .. '6D6772737461740031323536' seq:0, type:0; will stop at (end)
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(2623KB)], [63(13MB)]
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322370945, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 17293732, "oldest_snapshot_seqno": -1}
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6858 keys, 14416410 bytes, temperature: kUnknown
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322495827, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 14416410, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14372124, "index_size": 26062, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17157, "raw_key_size": 176986, "raw_average_key_size": 25, "raw_value_size": 14250340, "raw_average_value_size": 2077, "num_data_blocks": 1044, "num_entries": 6858, "num_filter_entries": 6858, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769164322, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.496262) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 14416410 bytes
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.498681) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 138.3 rd, 115.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 13.9 +0.0 blob) out(13.7 +0.0 blob), read-write-amplify(11.8) write-amplify(5.4) OK, records in: 7802, records dropped: 944 output_compression: NoCompression
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.498712) EVENT_LOG_v1 {"time_micros": 1769164322498698, "job": 38, "event": "compaction_finished", "compaction_time_micros": 125013, "compaction_time_cpu_micros": 33641, "output_level": 6, "num_output_files": 1, "total_output_size": 14416410, "num_input_records": 7802, "num_output_records": 6858, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322499813, "job": 38, "event": "table_file_deletion", "file_number": 65}
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322504483, "job": 38, "event": "table_file_deletion", "file_number": 63}
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.370552) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.504593) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.504598) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.504600) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.504602) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:02 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:02.504609) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:03.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:32:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:03.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:32:03 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:04 np0005593295 nova_compute[225701]: 2026-01-23 10:32:04.642 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:32:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:05.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:32:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:32:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:05.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:32:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:06 np0005593295 podman[246453]: 2026-01-23 10:32:06.644213904 +0000 UTC m=+0.061719303 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 05:32:06 np0005593295 podman[246452]: 2026-01-23 10:32:06.712137697 +0000 UTC m=+0.129981644 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 05:32:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:07 np0005593295 nova_compute[225701]: 2026-01-23 10:32:07.309 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:07.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:32:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:07.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:32:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:08 np0005593295 nova_compute[225701]: 2026-01-23 10:32:08.796 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:08 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:09.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:09 np0005593295 nova_compute[225701]: 2026-01-23 10:32:09.645 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:09.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:10 np0005593295 nova_compute[225701]: 2026-01-23 10:32:10.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:10 np0005593295 nova_compute[225701]: 2026-01-23 10:32:10.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:32:10 np0005593295 nova_compute[225701]: 2026-01-23 10:32:10.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:32:10 np0005593295 nova_compute[225701]: 2026-01-23 10:32:10.803 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:32:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:11.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:11.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:12 np0005593295 nova_compute[225701]: 2026-01-23 10:32:12.310 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:32:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:13.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:32:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:13.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:13 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:14 np0005593295 nova_compute[225701]: 2026-01-23 10:32:14.647 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:15.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:15.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:15 np0005593295 nova_compute[225701]: 2026-01-23 10:32:15.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:16 np0005593295 nova_compute[225701]: 2026-01-23 10:32:16.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:16 np0005593295 nova_compute[225701]: 2026-01-23 10:32:16.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:16 np0005593295 nova_compute[225701]: 2026-01-23 10:32:16.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:16 np0005593295 nova_compute[225701]: 2026-01-23 10:32:16.819 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:16 np0005593295 nova_compute[225701]: 2026-01-23 10:32:16.820 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:16 np0005593295 nova_compute[225701]: 2026-01-23 10:32:16.820 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:16 np0005593295 nova_compute[225701]: 2026-01-23 10:32:16.820 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:32:16 np0005593295 nova_compute[225701]: 2026-01-23 10:32:16.820 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:32:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:17 np0005593295 nova_compute[225701]: 2026-01-23 10:32:17.312 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3707941147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:32:17 np0005593295 nova_compute[225701]: 2026-01-23 10:32:17.347 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:32:17 np0005593295 nova_compute[225701]: 2026-01-23 10:32:17.504 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:32:17 np0005593295 nova_compute[225701]: 2026-01-23 10:32:17.506 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4850MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:32:17 np0005593295 nova_compute[225701]: 2026-01-23 10:32:17.506 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:17 np0005593295 nova_compute[225701]: 2026-01-23 10:32:17.507 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:17.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:32:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:17.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.886895) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337886953, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 400, "num_deletes": 251, "total_data_size": 456611, "memory_usage": 464200, "flush_reason": "Manual Compaction"}
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337891440, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 298037, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36480, "largest_seqno": 36874, "table_properties": {"data_size": 295738, "index_size": 463, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 5740, "raw_average_key_size": 18, "raw_value_size": 291160, "raw_average_value_size": 948, "num_data_blocks": 20, "num_entries": 307, "num_filter_entries": 307, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164322, "oldest_key_time": 1769164322, "file_creation_time": 1769164337, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 4586 microseconds, and 2073 cpu microseconds.
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.891493) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 298037 bytes OK
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.891510) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.893780) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.893801) EVENT_LOG_v1 {"time_micros": 1769164337893795, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.893818) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 454028, prev total WAL file size 454028, number of live WAL files 2.
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.894361) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(291KB)], [66(13MB)]
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337894405, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 14714447, "oldest_snapshot_seqno": -1}
Jan 23 05:32:17 np0005593295 nova_compute[225701]: 2026-01-23 10:32:17.959 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:32:17 np0005593295 nova_compute[225701]: 2026-01-23 10:32:17.959 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:32:17 np0005593295 nova_compute[225701]: 2026-01-23 10:32:17.985 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 6655 keys, 12553762 bytes, temperature: kUnknown
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337987140, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 12553762, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12512252, "index_size": 23798, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16645, "raw_key_size": 173472, "raw_average_key_size": 26, "raw_value_size": 12395417, "raw_average_value_size": 1862, "num_data_blocks": 943, "num_entries": 6655, "num_filter_entries": 6655, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769164337, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.987469) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 12553762 bytes
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.989239) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 158.4 rd, 135.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 13.7 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(91.5) write-amplify(42.1) OK, records in: 7165, records dropped: 510 output_compression: NoCompression
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.989281) EVENT_LOG_v1 {"time_micros": 1769164337989262, "job": 40, "event": "compaction_finished", "compaction_time_micros": 92873, "compaction_time_cpu_micros": 30914, "output_level": 6, "num_output_files": 1, "total_output_size": 12553762, "num_input_records": 7165, "num_output_records": 6655, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337989578, "job": 40, "event": "table_file_deletion", "file_number": 68}
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337996665, "job": 40, "event": "table_file_deletion", "file_number": 66}
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.894251) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.996781) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.996796) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.996798) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.996799) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:17 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:32:17.996800) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:18 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:32:18 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4119112767' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:32:18 np0005593295 nova_compute[225701]: 2026-01-23 10:32:18.477 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:32:18 np0005593295 nova_compute[225701]: 2026-01-23 10:32:18.484 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:32:18 np0005593295 nova_compute[225701]: 2026-01-23 10:32:18.511 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:32:18 np0005593295 nova_compute[225701]: 2026-01-23 10:32:18.513 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:32:18 np0005593295 nova_compute[225701]: 2026-01-23 10:32:18.513 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:18 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:19 np0005593295 nova_compute[225701]: 2026-01-23 10:32:19.648 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:32:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:19.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:32:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:32:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:19.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:32:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:20 np0005593295 nova_compute[225701]: 2026-01-23 10:32:20.513 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:20 np0005593295 nova_compute[225701]: 2026-01-23 10:32:20.513 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:21.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:32:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:21.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:32:21 np0005593295 nova_compute[225701]: 2026-01-23 10:32:21.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:21 np0005593295 nova_compute[225701]: 2026-01-23 10:32:21.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:32:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:22 np0005593295 nova_compute[225701]: 2026-01-23 10:32:22.316 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:32:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:23.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:32:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:23.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:23 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:24 np0005593295 nova_compute[225701]: 2026-01-23 10:32:24.649 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:25.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:25.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:27 np0005593295 nova_compute[225701]: 2026-01-23 10:32:27.319 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:32:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:27.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:32:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:27.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:29 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:29.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:29 np0005593295 nova_compute[225701]: 2026-01-23 10:32:29.694 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:29.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:31.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:31.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:32 np0005593295 nova_compute[225701]: 2026-01-23 10:32:32.321 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:32:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:33.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:32:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:33.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:34 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:34 np0005593295 nova_compute[225701]: 2026-01-23 10:32:34.696 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:35 np0005593295 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 23 05:32:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:32:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:35.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:32:35 np0005593295 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 23 05:32:35 np0005593295 radosgw[82185]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 23 05:32:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:35.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:37 np0005593295 nova_compute[225701]: 2026-01-23 10:32:37.324 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:37 np0005593295 podman[246618]: 2026-01-23 10:32:37.632485729 +0000 UTC m=+0.048397265 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:32:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:32:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:37.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:32:37 np0005593295 podman[246617]: 2026-01-23 10:32:37.684196836 +0000 UTC m=+0.101167018 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:32:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:32:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:37.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:32:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:39 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:39.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:39 np0005593295 nova_compute[225701]: 2026-01-23 10:32:39.698 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:39.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:32:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:41.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:32:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:41.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:42 np0005593295 nova_compute[225701]: 2026-01-23 10:32:42.370 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:32:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:43.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:32:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:43.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:44 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:44 np0005593295 nova_compute[225701]: 2026-01-23 10:32:44.723 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:45.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:45.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:47 np0005593295 nova_compute[225701]: 2026-01-23 10:32:47.372 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:47.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:32:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:47.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:32:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:49 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:49.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:49 np0005593295 nova_compute[225701]: 2026-01-23 10:32:49.726 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:32:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:49.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:32:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:51.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:51.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:52 np0005593295 nova_compute[225701]: 2026-01-23 10:32:52.375 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:32:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:53.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:32:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:32:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:53.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:32:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:54 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:54 np0005593295 nova_compute[225701]: 2026-01-23 10:32:54.728 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:32:55.502 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:32:55.503 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:32:55.503 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:32:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:55.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:32:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:32:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:55.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:32:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:57 np0005593295 nova_compute[225701]: 2026-01-23 10:32:57.378 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:57.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:57.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:32:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:32:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:32:59 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:59.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:59 np0005593295 nova_compute[225701]: 2026-01-23 10:32:59.731 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:32:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:59.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:01 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:33:01 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:33:01 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:33:01 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:33:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:01.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:01.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:02 np0005593295 nova_compute[225701]: 2026-01-23 10:33:02.380 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:03.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:03.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:04 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:04 np0005593295 nova_compute[225701]: 2026-01-23 10:33:04.734 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:05.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:05.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:05 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:33:05 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:33:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:07 np0005593295 nova_compute[225701]: 2026-01-23 10:33:07.382 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.463886) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387463926, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 743, "num_deletes": 251, "total_data_size": 1499624, "memory_usage": 1527592, "flush_reason": "Manual Compaction"}
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387471352, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 980518, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36880, "largest_seqno": 37617, "table_properties": {"data_size": 976921, "index_size": 1441, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 7221, "raw_average_key_size": 17, "raw_value_size": 969770, "raw_average_value_size": 2298, "num_data_blocks": 62, "num_entries": 422, "num_filter_entries": 422, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164337, "oldest_key_time": 1769164337, "file_creation_time": 1769164387, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 7520 microseconds, and 3402 cpu microseconds.
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.471406) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 980518 bytes OK
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.471424) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.473414) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.473428) EVENT_LOG_v1 {"time_micros": 1769164387473423, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.473445) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 1495708, prev total WAL file size 1495708, number of live WAL files 2.
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.474081) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323531' seq:72057594037927935, type:22 .. '6B7600353033' seq:0, type:0; will stop at (end)
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(957KB)], [69(11MB)]
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387474206, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 13534280, "oldest_snapshot_seqno": -1}
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6561 keys, 12131420 bytes, temperature: kUnknown
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387561925, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 12131420, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12090525, "index_size": 23375, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16453, "raw_key_size": 173192, "raw_average_key_size": 26, "raw_value_size": 11975107, "raw_average_value_size": 1825, "num_data_blocks": 913, "num_entries": 6561, "num_filter_entries": 6561, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769164387, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.562170) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 12131420 bytes
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.564130) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 154.1 rd, 138.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 12.0 +0.0 blob) out(11.6 +0.0 blob), read-write-amplify(26.2) write-amplify(12.4) OK, records in: 7077, records dropped: 516 output_compression: NoCompression
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.564147) EVENT_LOG_v1 {"time_micros": 1769164387564140, "job": 42, "event": "compaction_finished", "compaction_time_micros": 87813, "compaction_time_cpu_micros": 28686, "output_level": 6, "num_output_files": 1, "total_output_size": 12131420, "num_input_records": 7077, "num_output_records": 6561, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387564468, "job": 42, "event": "table_file_deletion", "file_number": 71}
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387566537, "job": 42, "event": "table_file_deletion", "file_number": 69}
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.473918) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.566641) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.566648) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.566649) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.566650) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:33:07 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:33:07.566652) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:33:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:07.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:07.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:08 np0005593295 podman[246845]: 2026-01-23 10:33:08.343675587 +0000 UTC m=+0.054919107 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 05:33:08 np0005593295 podman[246844]: 2026-01-23 10:33:08.409318464 +0000 UTC m=+0.123844914 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 05:33:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:09 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:09.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:09 np0005593295 nova_compute[225701]: 2026-01-23 10:33:09.764 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:09 np0005593295 nova_compute[225701]: 2026-01-23 10:33:09.778 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:09.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:11.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:33:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:11.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:33:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:12 np0005593295 nova_compute[225701]: 2026-01-23 10:33:12.385 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:12 np0005593295 nova_compute[225701]: 2026-01-23 10:33:12.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:12 np0005593295 nova_compute[225701]: 2026-01-23 10:33:12.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:33:12 np0005593295 nova_compute[225701]: 2026-01-23 10:33:12.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:33:12 np0005593295 nova_compute[225701]: 2026-01-23 10:33:12.806 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:33:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:13.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:13.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:14 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:14 np0005593295 nova_compute[225701]: 2026-01-23 10:33:14.768 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:15.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:15.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:16 np0005593295 nova_compute[225701]: 2026-01-23 10:33:16.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:17 np0005593295 nova_compute[225701]: 2026-01-23 10:33:17.387 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:17.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:17 np0005593295 nova_compute[225701]: 2026-01-23 10:33:17.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:17 np0005593295 nova_compute[225701]: 2026-01-23 10:33:17.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:33:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:17.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:33:18 np0005593295 nova_compute[225701]: 2026-01-23 10:33:18.017 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:18 np0005593295 nova_compute[225701]: 2026-01-23 10:33:18.017 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:18 np0005593295 nova_compute[225701]: 2026-01-23 10:33:18.018 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:18 np0005593295 nova_compute[225701]: 2026-01-23 10:33:18.018 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:33:18 np0005593295 nova_compute[225701]: 2026-01-23 10:33:18.018 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:18 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:33:18 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1310870223' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:33:18 np0005593295 nova_compute[225701]: 2026-01-23 10:33:18.504 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:18 np0005593295 nova_compute[225701]: 2026-01-23 10:33:18.657 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:33:18 np0005593295 nova_compute[225701]: 2026-01-23 10:33:18.658 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4846MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:33:18 np0005593295 nova_compute[225701]: 2026-01-23 10:33:18.658 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:18 np0005593295 nova_compute[225701]: 2026-01-23 10:33:18.659 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:18 np0005593295 nova_compute[225701]: 2026-01-23 10:33:18.781 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:33:18 np0005593295 nova_compute[225701]: 2026-01-23 10:33:18.781 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:33:18 np0005593295 nova_compute[225701]: 2026-01-23 10:33:18.866 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Refreshing inventories for resource provider db762d15-510c-4120-bfc4-afe76b90b657 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:33:18 np0005593295 nova_compute[225701]: 2026-01-23 10:33:18.932 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Updating ProviderTree inventory for provider db762d15-510c-4120-bfc4-afe76b90b657 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:33:18 np0005593295 nova_compute[225701]: 2026-01-23 10:33:18.933 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Updating inventory in ProviderTree for provider db762d15-510c-4120-bfc4-afe76b90b657 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:33:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:19 np0005593295 nova_compute[225701]: 2026-01-23 10:33:19.153 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Refreshing aggregate associations for resource provider db762d15-510c-4120-bfc4-afe76b90b657, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:33:19 np0005593295 nova_compute[225701]: 2026-01-23 10:33:19.180 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Refreshing trait associations for resource provider db762d15-510c-4120-bfc4-afe76b90b657, traits: COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX2,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_F16C,HW_CPU_X86_ABM,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_BMI2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SVM,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_FDC,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_CLMUL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:33:19 np0005593295 nova_compute[225701]: 2026-01-23 10:33:19.229 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:19 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:19 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:33:19 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3715483574' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:33:19 np0005593295 nova_compute[225701]: 2026-01-23 10:33:19.709 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:19 np0005593295 nova_compute[225701]: 2026-01-23 10:33:19.715 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:33:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:33:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:19.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:33:19 np0005593295 nova_compute[225701]: 2026-01-23 10:33:19.806 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:33:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:19.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:33:19 np0005593295 nova_compute[225701]: 2026-01-23 10:33:19.822 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:33:19 np0005593295 nova_compute[225701]: 2026-01-23 10:33:19.824 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:33:19 np0005593295 nova_compute[225701]: 2026-01-23 10:33:19.825 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:20 np0005593295 nova_compute[225701]: 2026-01-23 10:33:20.825 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:20 np0005593295 nova_compute[225701]: 2026-01-23 10:33:20.825 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:20 np0005593295 nova_compute[225701]: 2026-01-23 10:33:20.825 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:21.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:21.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:22 np0005593295 nova_compute[225701]: 2026-01-23 10:33:22.389 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:22 np0005593295 nova_compute[225701]: 2026-01-23 10:33:22.778 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:22 np0005593295 nova_compute[225701]: 2026-01-23 10:33:22.796 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:22 np0005593295 nova_compute[225701]: 2026-01-23 10:33:22.797 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:33:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:23.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:33:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:23.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:33:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:24 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:24 np0005593295 nova_compute[225701]: 2026-01-23 10:33:24.808 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:25.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:25.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:27 np0005593295 nova_compute[225701]: 2026-01-23 10:33:27.391 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:27.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:33:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:27.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:33:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:29 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:29.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:29 np0005593295 nova_compute[225701]: 2026-01-23 10:33:29.810 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:29.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:31.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:33:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:31.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:33:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:32 np0005593295 nova_compute[225701]: 2026-01-23 10:33:32.394 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:33.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:33.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:34 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:34 np0005593295 nova_compute[225701]: 2026-01-23 10:33:34.812 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:35.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:35.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:37 np0005593295 nova_compute[225701]: 2026-01-23 10:33:37.396 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:37.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:33:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:37.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:33:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:38 np0005593295 podman[246989]: 2026-01-23 10:33:38.682639138 +0000 UTC m=+0.095206423 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Jan 23 05:33:38 np0005593295 podman[246988]: 2026-01-23 10:33:38.688079271 +0000 UTC m=+0.108521799 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:33:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:39 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:39.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:39 np0005593295 nova_compute[225701]: 2026-01-23 10:33:39.813 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:39.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:41.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:41.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:42 np0005593295 nova_compute[225701]: 2026-01-23 10:33:42.398 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:43.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:33:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:43.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:33:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:44 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:44 np0005593295 nova_compute[225701]: 2026-01-23 10:33:44.815 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:45.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:45.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:47 np0005593295 nova_compute[225701]: 2026-01-23 10:33:47.400 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:33:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:47.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:33:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:33:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:47.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:33:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:49 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:49.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:49 np0005593295 nova_compute[225701]: 2026-01-23 10:33:49.818 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:49.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:51.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:33:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:51.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:33:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:52 np0005593295 nova_compute[225701]: 2026-01-23 10:33:52.403 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:53.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:53.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:54 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:54 np0005593295 nova_compute[225701]: 2026-01-23 10:33:54.819 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:33:55.504 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:33:55.504 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:33:55.505 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:33:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:55.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:33:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:33:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:55.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:33:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:57 np0005593295 nova_compute[225701]: 2026-01-23 10:33:57.405 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:57.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:33:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:57.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:33:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:33:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:33:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:33:59 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:59.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:59 np0005593295 nova_compute[225701]: 2026-01-23 10:33:59.821 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:33:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:59.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:34:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:01.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:34:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:34:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:01.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:34:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:02 np0005593295 nova_compute[225701]: 2026-01-23 10:34:02.407 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:34:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:03.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:34:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:03.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:04 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:04 np0005593295 nova_compute[225701]: 2026-01-23 10:34:04.823 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:34:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:05.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:34:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:05.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:06 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:34:06 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:34:06 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:34:06 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:34:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:07 np0005593295 nova_compute[225701]: 2026-01-23 10:34:07.420 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:07.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:34:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:07.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:34:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:09 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:09 np0005593295 podman[247197]: 2026-01-23 10:34:09.625673651 +0000 UTC m=+0.048605841 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 05:34:09 np0005593295 podman[247196]: 2026-01-23 10:34:09.693510352 +0000 UTC m=+0.121271250 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:34:09 np0005593295 nova_compute[225701]: 2026-01-23 10:34:09.825 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:34:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:09.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:34:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:09.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:10 np0005593295 nova_compute[225701]: 2026-01-23 10:34:10.797 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:11.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:11.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:12 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:34:12 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:34:12 np0005593295 nova_compute[225701]: 2026-01-23 10:34:12.422 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:13.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:13.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:14 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:14 np0005593295 nova_compute[225701]: 2026-01-23 10:34:14.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:14 np0005593295 nova_compute[225701]: 2026-01-23 10:34:14.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:34:14 np0005593295 nova_compute[225701]: 2026-01-23 10:34:14.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:34:14 np0005593295 nova_compute[225701]: 2026-01-23 10:34:14.801 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:34:14 np0005593295 nova_compute[225701]: 2026-01-23 10:34:14.825 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:15.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:15.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:16 np0005593295 nova_compute[225701]: 2026-01-23 10:34:16.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:17 np0005593295 nova_compute[225701]: 2026-01-23 10:34:17.424 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:17 np0005593295 nova_compute[225701]: 2026-01-23 10:34:17.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:34:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:17.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:34:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:17.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:19 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:19 np0005593295 nova_compute[225701]: 2026-01-23 10:34:19.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:19 np0005593295 nova_compute[225701]: 2026-01-23 10:34:19.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:19 np0005593295 nova_compute[225701]: 2026-01-23 10:34:19.827 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:19.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:34:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:19.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:34:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:20 np0005593295 nova_compute[225701]: 2026-01-23 10:34:20.069 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:34:20 np0005593295 nova_compute[225701]: 2026-01-23 10:34:20.069 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:34:20 np0005593295 nova_compute[225701]: 2026-01-23 10:34:20.069 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:34:20 np0005593295 nova_compute[225701]: 2026-01-23 10:34:20.070 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:34:20 np0005593295 nova_compute[225701]: 2026-01-23 10:34:20.070 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:34:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:20 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:34:20 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/505590307' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:34:20 np0005593295 nova_compute[225701]: 2026-01-23 10:34:20.534 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:34:20 np0005593295 nova_compute[225701]: 2026-01-23 10:34:20.702 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:34:20 np0005593295 nova_compute[225701]: 2026-01-23 10:34:20.703 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4852MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:34:20 np0005593295 nova_compute[225701]: 2026-01-23 10:34:20.703 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:34:20 np0005593295 nova_compute[225701]: 2026-01-23 10:34:20.704 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:34:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:21 np0005593295 nova_compute[225701]: 2026-01-23 10:34:21.259 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:34:21 np0005593295 nova_compute[225701]: 2026-01-23 10:34:21.259 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:34:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:21 np0005593295 nova_compute[225701]: 2026-01-23 10:34:21.279 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:34:21 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:34:21 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1417754147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:34:21 np0005593295 nova_compute[225701]: 2026-01-23 10:34:21.732 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:34:21 np0005593295 nova_compute[225701]: 2026-01-23 10:34:21.739 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:34:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:21.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:21.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:21 np0005593295 nova_compute[225701]: 2026-01-23 10:34:21.968 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:34:21 np0005593295 nova_compute[225701]: 2026-01-23 10:34:21.969 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:34:21 np0005593295 nova_compute[225701]: 2026-01-23 10:34:21.970 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.266s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:34:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:22 np0005593295 nova_compute[225701]: 2026-01-23 10:34:22.426 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:22 np0005593295 nova_compute[225701]: 2026-01-23 10:34:22.970 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:22 np0005593295 nova_compute[225701]: 2026-01-23 10:34:22.970 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:23 np0005593295 nova_compute[225701]: 2026-01-23 10:34:23.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:23 np0005593295 nova_compute[225701]: 2026-01-23 10:34:23.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:34:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:23.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:23.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:24 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:24 np0005593295 nova_compute[225701]: 2026-01-23 10:34:24.831 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:25.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:25.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:27 np0005593295 nova_compute[225701]: 2026-01-23 10:34:27.428 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:34:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:27.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:34:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:34:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:27.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:34:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:29 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:29 np0005593295 nova_compute[225701]: 2026-01-23 10:34:29.832 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:29.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:29.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:31.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:34:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:31.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:34:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:32 np0005593295 nova_compute[225701]: 2026-01-23 10:34:32.430 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:34:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:33.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:34:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:33.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:34 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:34 np0005593295 nova_compute[225701]: 2026-01-23 10:34:34.836 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:34:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:35.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:34:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:35.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:37 np0005593295 nova_compute[225701]: 2026-01-23 10:34:37.432 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:37.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:37.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:39 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:39 np0005593295 nova_compute[225701]: 2026-01-23 10:34:39.836 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:39.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:34:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:39.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:34:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:40 np0005593295 podman[247366]: 2026-01-23 10:34:40.623133771 +0000 UTC m=+0.047938565 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 23 05:34:40 np0005593295 podman[247365]: 2026-01-23 10:34:40.65371343 +0000 UTC m=+0.078839123 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:34:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:34:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:41.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:34:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:41.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:42 np0005593295 nova_compute[225701]: 2026-01-23 10:34:42.435 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:43.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:34:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:43.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:34:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:44 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:44 np0005593295 nova_compute[225701]: 2026-01-23 10:34:44.837 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:45.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:45.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:47 np0005593295 nova_compute[225701]: 2026-01-23 10:34:47.436 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:47.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:47.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:49 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:49 np0005593295 nova_compute[225701]: 2026-01-23 10:34:49.840 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:49.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:49.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:51.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:51.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:52 np0005593295 nova_compute[225701]: 2026-01-23 10:34:52.438 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:34:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:53.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:34:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.002000047s ======
Jan 23 05:34:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:53.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000047s
Jan 23 05:34:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:54 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:54 np0005593295 nova_compute[225701]: 2026-01-23 10:34:54.841 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:34:55.505 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:34:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:34:55.506 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:34:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:34:55.506 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:34:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:55.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:34:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:55.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:34:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:57 np0005593295 nova_compute[225701]: 2026-01-23 10:34:57.441 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:57.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:57.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:34:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:34:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:34:59 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:59 np0005593295 nova_compute[225701]: 2026-01-23 10:34:59.843 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:34:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:59.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:34:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:34:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:59.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:01.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:01.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:02 np0005593295 nova_compute[225701]: 2026-01-23 10:35:02.444 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:03.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:03.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:04 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:04 np0005593295 nova_compute[225701]: 2026-01-23 10:35:04.846 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:05.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:05 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:05 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:05 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:05.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:07 np0005593295 nova_compute[225701]: 2026-01-23 10:35:07.446 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:07.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:07 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:07 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:07 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:07.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:09 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:09 np0005593295 nova_compute[225701]: 2026-01-23 10:35:09.847 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:09.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:09 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:09 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:09 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:09.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:11 np0005593295 podman[247490]: 2026-01-23 10:35:11.62705722 +0000 UTC m=+0.047775161 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 23 05:35:11 np0005593295 podman[247488]: 2026-01-23 10:35:11.650814852 +0000 UTC m=+0.077643702 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 23 05:35:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:11.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:11 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:11 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:35:11 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:11.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:35:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:12 np0005593295 podman[247652]: 2026-01-23 10:35:12.183158358 +0000 UTC m=+0.069337958 container exec 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Jan 23 05:35:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:12 np0005593295 podman[247652]: 2026-01-23 10:35:12.310157499 +0000 UTC m=+0.196337069 container exec_died 40f46bf40aa2a1dd44c1667f18474817146de143b9a9094e785510ebdd8206e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:35:12 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 05:35:12 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 05:35:12 np0005593295 nova_compute[225701]: 2026-01-23 10:35:12.448 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:12 np0005593295 podman[247756]: 2026-01-23 10:35:12.634808379 +0000 UTC m=+0.050316913 container exec 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 05:35:12 np0005593295 podman[247756]: 2026-01-23 10:35:12.650279948 +0000 UTC m=+0.065788492 container exec_died 610246eaa4f552466bc8eee8cc806d516dfcdb85055d982f77b2276e24631ff0 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 05:35:12 np0005593295 nova_compute[225701]: 2026-01-23 10:35:12.778 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:13 np0005593295 podman[247910]: 2026-01-23 10:35:13.232005515 +0000 UTC m=+0.050396325 container exec c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 05:35:13 np0005593295 podman[247910]: 2026-01-23 10:35:13.240432061 +0000 UTC m=+0.058822831 container exec_died c9fabe49ecac94420d698684f7f0ce6311c7531d91804b2e676e15edf98f3e26 (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-2-bbaqsj)
Jan 23 05:35:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:13 np0005593295 podman[247975]: 2026-01-23 10:35:13.419794314 +0000 UTC m=+0.047454204 container exec 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, version=2.2.4, vendor=Red Hat, Inc., distribution-scope=public, name=keepalived, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, release=1793, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Jan 23 05:35:13 np0005593295 podman[247975]: 2026-01-23 10:35:13.43394857 +0000 UTC m=+0.061608440 container exec_died 5f2fa996bf6c111f1884c2f59ff9b058d96f90c6e8935be1b7683b38269ec7aa (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., release=1793, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, version=2.2.4, com.redhat.component=keepalived-container, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived)
Jan 23 05:35:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:13.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:13 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:13 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:13 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:13.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:14 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:14 np0005593295 nova_compute[225701]: 2026-01-23 10:35:14.848 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:15 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:35:15 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:35:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:15.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:15 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:15 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:35:15 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:15.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:35:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:16 np0005593295 nova_compute[225701]: 2026-01-23 10:35:16.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:16 np0005593295 nova_compute[225701]: 2026-01-23 10:35:16.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:35:16 np0005593295 nova_compute[225701]: 2026-01-23 10:35:16.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:35:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:17 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 05:35:17 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:35:17 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:35:17 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:35:17 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:35:17 np0005593295 nova_compute[225701]: 2026-01-23 10:35:17.204 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:35:17 np0005593295 nova_compute[225701]: 2026-01-23 10:35:17.205 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:17 np0005593295 nova_compute[225701]: 2026-01-23 10:35:17.490 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:17.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:17 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:17 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:17 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:17.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:18 np0005593295 nova_compute[225701]: 2026-01-23 10:35:18.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:19 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:19 np0005593295 nova_compute[225701]: 2026-01-23 10:35:19.850 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:35:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:19.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:35:19 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:19 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:19 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:19.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:21 np0005593295 nova_compute[225701]: 2026-01-23 10:35:21.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:21 np0005593295 nova_compute[225701]: 2026-01-23 10:35:21.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:21 np0005593295 nova_compute[225701]: 2026-01-23 10:35:21.810 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:21 np0005593295 nova_compute[225701]: 2026-01-23 10:35:21.811 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:21 np0005593295 nova_compute[225701]: 2026-01-23 10:35:21.811 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:21 np0005593295 nova_compute[225701]: 2026-01-23 10:35:21.811 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:35:21 np0005593295 nova_compute[225701]: 2026-01-23 10:35:21.812 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:21.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:21 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:21 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:21 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:21.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:22 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:35:22 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3397794839' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:35:22 np0005593295 nova_compute[225701]: 2026-01-23 10:35:22.265 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:22 np0005593295 nova_compute[225701]: 2026-01-23 10:35:22.419 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:35:22 np0005593295 nova_compute[225701]: 2026-01-23 10:35:22.420 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4791MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:35:22 np0005593295 nova_compute[225701]: 2026-01-23 10:35:22.420 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:22 np0005593295 nova_compute[225701]: 2026-01-23 10:35:22.420 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:22 np0005593295 nova_compute[225701]: 2026-01-23 10:35:22.493 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:22 np0005593295 nova_compute[225701]: 2026-01-23 10:35:22.586 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:35:22 np0005593295 nova_compute[225701]: 2026-01-23 10:35:22.587 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:35:22 np0005593295 nova_compute[225701]: 2026-01-23 10:35:22.660 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:23 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:35:23 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/342726147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:35:23 np0005593295 nova_compute[225701]: 2026-01-23 10:35:23.111 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:23 np0005593295 nova_compute[225701]: 2026-01-23 10:35:23.116 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:35:23 np0005593295 nova_compute[225701]: 2026-01-23 10:35:23.191 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:35:23 np0005593295 nova_compute[225701]: 2026-01-23 10:35:23.193 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:35:23 np0005593295 nova_compute[225701]: 2026-01-23 10:35:23.193 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.773s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:23.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:23 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:23 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:23 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:23.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:24 np0005593295 nova_compute[225701]: 2026-01-23 10:35:24.193 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:24 np0005593295 nova_compute[225701]: 2026-01-23 10:35:24.194 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:24 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:35:24 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:35:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:24 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:24 np0005593295 nova_compute[225701]: 2026-01-23 10:35:24.852 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:25 np0005593295 nova_compute[225701]: 2026-01-23 10:35:25.778 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:25 np0005593295 nova_compute[225701]: 2026-01-23 10:35:25.856 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:25 np0005593295 nova_compute[225701]: 2026-01-23 10:35:25.856 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:35:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:25.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:25 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:25 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:25 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:25.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:27 np0005593295 nova_compute[225701]: 2026-01-23 10:35:27.536 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:35:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:27.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:35:27 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:27 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:27 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:27.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:29 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:29 np0005593295 nova_compute[225701]: 2026-01-23 10:35:29.720 225706 DEBUG oslo_concurrency.processutils [None req-a8510dbf-d677-4163-b681-0279df98cd8c 00aca23f964f49a5a9abfea9744e871b 5220cd4f58cb43bb899e367e961bc5c1 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:29 np0005593295 nova_compute[225701]: 2026-01-23 10:35:29.754 225706 DEBUG oslo_concurrency.processutils [None req-a8510dbf-d677-4163-b681-0279df98cd8c 00aca23f964f49a5a9abfea9744e871b 5220cd4f58cb43bb899e367e961bc5c1 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:29 np0005593295 nova_compute[225701]: 2026-01-23 10:35:29.854 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:29.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:29 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:29 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:29 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:29.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:31.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:31 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:31 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:31 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:31.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:32 np0005593295 nova_compute[225701]: 2026-01-23 10:35:32.538 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:33.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:33 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:33 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:33 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:33.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:34 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:34 np0005593295 nova_compute[225701]: 2026-01-23 10:35:34.857 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:35 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:35:35.630 142606 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:35:35 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:35:35.631 142606 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:35:35 np0005593295 nova_compute[225701]: 2026-01-23 10:35:35.673 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:35.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:35 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:35 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:35 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:35.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:37 np0005593295 nova_compute[225701]: 2026-01-23 10:35:37.541 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:37.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:37 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:37 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:35:37 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:37.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:35:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:39 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:39 np0005593295 nova_compute[225701]: 2026-01-23 10:35:39.859 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:35:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:39.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:35:39 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:39 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:39 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:39.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:41.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:41 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:41 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:41 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:41.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:42 np0005593295 nova_compute[225701]: 2026-01-23 10:35:42.544 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:42 np0005593295 podman[248246]: 2026-01-23 10:35:42.63270285 +0000 UTC m=+0.051553853 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 23 05:35:42 np0005593295 podman[248245]: 2026-01-23 10:35:42.667768339 +0000 UTC m=+0.086776606 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:35:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:35:43 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:43.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:35:43 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c1ba5d0 =====
Jan 23 05:35:43 np0005593295 radosgw[82185]: ====== req done req=0x7f821c1ba5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:43 np0005593295 radosgw[82185]: beast: 0x7f821c1ba5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:43.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:44 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:35:44.634 142606 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fb585ea-168c-48ac-870f-617a4fa1bbde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:35:44 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:44 np0005593295 nova_compute[225701]: 2026-01-23 10:35:44.860 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:45.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:45 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:45 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:45 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:45.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:47 np0005593295 nova_compute[225701]: 2026-01-23 10:35:47.546 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:47.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:47 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:47 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:47 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:47.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:49 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:49 np0005593295 nova_compute[225701]: 2026-01-23 10:35:49.862 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:49.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:49 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:49 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:49 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:49.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:51.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:51 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:51 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:35:51 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:51.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:35:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:52 np0005593295 nova_compute[225701]: 2026-01-23 10:35:52.548 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:53.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:53 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:53 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:53 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:53.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:54 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:54 np0005593295 nova_compute[225701]: 2026-01-23 10:35:54.865 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:54 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Jan 23 05:35:54 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:54.934602) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:35:54 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Jan 23 05:35:54 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164554934873, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 1870, "num_deletes": 251, "total_data_size": 4935695, "memory_usage": 5020648, "flush_reason": "Manual Compaction"}
Jan 23 05:35:54 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Jan 23 05:35:54 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164554957333, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 3204080, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 37622, "largest_seqno": 39487, "table_properties": {"data_size": 3196224, "index_size": 4735, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16131, "raw_average_key_size": 20, "raw_value_size": 3180644, "raw_average_value_size": 3995, "num_data_blocks": 201, "num_entries": 796, "num_filter_entries": 796, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164388, "oldest_key_time": 1769164388, "file_creation_time": 1769164554, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:35:54 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 22743 microseconds, and 8166 cpu microseconds.
Jan 23 05:35:54 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:35:54 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:54.957419) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 3204080 bytes OK
Jan 23 05:35:54 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:54.957450) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Jan 23 05:35:54 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:54.959004) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Jan 23 05:35:54 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:54.959024) EVENT_LOG_v1 {"time_micros": 1769164554959020, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:35:54 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:54.959046) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:35:54 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 4927370, prev total WAL file size 4927370, number of live WAL files 2.
Jan 23 05:35:54 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:35:54 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:54.960562) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Jan 23 05:35:54 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:35:54 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(3128KB)], [72(11MB)]
Jan 23 05:35:54 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164554960706, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 15335500, "oldest_snapshot_seqno": -1}
Jan 23 05:35:55 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6839 keys, 13114719 bytes, temperature: kUnknown
Jan 23 05:35:55 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164555055191, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 13114719, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13071128, "index_size": 25367, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17157, "raw_key_size": 179703, "raw_average_key_size": 26, "raw_value_size": 12949802, "raw_average_value_size": 1893, "num_data_blocks": 991, "num_entries": 6839, "num_filter_entries": 6839, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769164554, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:35:55 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:35:55 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:55.055557) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 13114719 bytes
Jan 23 05:35:55 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:55.057209) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 162.1 rd, 138.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.1, 11.6 +0.0 blob) out(12.5 +0.0 blob), read-write-amplify(8.9) write-amplify(4.1) OK, records in: 7357, records dropped: 518 output_compression: NoCompression
Jan 23 05:35:55 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:55.057228) EVENT_LOG_v1 {"time_micros": 1769164555057218, "job": 44, "event": "compaction_finished", "compaction_time_micros": 94610, "compaction_time_cpu_micros": 28685, "output_level": 6, "num_output_files": 1, "total_output_size": 13114719, "num_input_records": 7357, "num_output_records": 6839, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:35:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:55 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:35:55 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164555057968, "job": 44, "event": "table_file_deletion", "file_number": 74}
Jan 23 05:35:55 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:35:55 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164555060238, "job": 44, "event": "table_file_deletion", "file_number": 72}
Jan 23 05:35:55 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:54.960370) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:35:55 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:55.060274) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:35:55 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:55.060278) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:35:55 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:55.060280) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:35:55 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:55.060281) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:35:55 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:35:55.060317) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:35:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:35:55.507 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:35:55.507 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:35:55.507 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:35:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:55.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:35:55 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:55 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:55 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:55.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:57 np0005593295 nova_compute[225701]: 2026-01-23 10:35:57.593 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:35:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:57.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:35:57 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:57 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:35:57 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:57.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:35:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:35:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:35:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:35:59 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:59 np0005593295 nova_compute[225701]: 2026-01-23 10:35:59.914 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:59.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:59 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:35:59 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:59 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:59.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:36:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:01.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:36:01 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:01 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:36:01 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:01.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:36:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:02 np0005593295 nova_compute[225701]: 2026-01-23 10:36:02.595 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:03 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:03 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:36:03 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:03.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:36:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:03.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:04 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:04 np0005593295 nova_compute[225701]: 2026-01-23 10:36:04.916 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:05.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:06.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:07 np0005593295 nova_compute[225701]: 2026-01-23 10:36:07.599 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:08.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:08.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:09 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:09 np0005593295 nova_compute[225701]: 2026-01-23 10:36:09.939 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:10.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:10.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:10 np0005593295 nova_compute[225701]: 2026-01-23 10:36:10.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:10 np0005593295 nova_compute[225701]: 2026-01-23 10:36:10.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:36:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:12.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:12.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:12 np0005593295 nova_compute[225701]: 2026-01-23 10:36:12.602 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:12 np0005593295 nova_compute[225701]: 2026-01-23 10:36:12.972 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:13 np0005593295 podman[248367]: 2026-01-23 10:36:13.662806705 +0000 UTC m=+0.082275536 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 23 05:36:13 np0005593295 podman[248368]: 2026-01-23 10:36:13.663846021 +0000 UTC m=+0.083125977 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 05:36:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:14.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:14.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:14 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:14 np0005593295 nova_compute[225701]: 2026-01-23 10:36:14.942 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:16.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:36:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:16.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:36:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:16 np0005593295 nova_compute[225701]: 2026-01-23 10:36:16.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:17 np0005593295 nova_compute[225701]: 2026-01-23 10:36:17.604 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:17 np0005593295 nova_compute[225701]: 2026-01-23 10:36:17.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:17 np0005593295 nova_compute[225701]: 2026-01-23 10:36:17.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:36:17 np0005593295 nova_compute[225701]: 2026-01-23 10:36:17.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:36:17 np0005593295 nova_compute[225701]: 2026-01-23 10:36:17.929 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:36:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:18.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:18.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:19 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:19 np0005593295 nova_compute[225701]: 2026-01-23 10:36:19.944 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:20.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:20.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:20 np0005593295 nova_compute[225701]: 2026-01-23 10:36:20.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:22.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:22.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:22 np0005593295 nova_compute[225701]: 2026-01-23 10:36:22.606 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:22 np0005593295 nova_compute[225701]: 2026-01-23 10:36:22.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:22 np0005593295 nova_compute[225701]: 2026-01-23 10:36:22.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:22 np0005593295 nova_compute[225701]: 2026-01-23 10:36:22.918 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:22 np0005593295 nova_compute[225701]: 2026-01-23 10:36:22.919 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:22 np0005593295 nova_compute[225701]: 2026-01-23 10:36:22.919 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:22 np0005593295 nova_compute[225701]: 2026-01-23 10:36:22.919 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:36:22 np0005593295 nova_compute[225701]: 2026-01-23 10:36:22.920 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:36:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:23 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:36:23 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4289693492' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:36:23 np0005593295 nova_compute[225701]: 2026-01-23 10:36:23.358 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:36:23 np0005593295 nova_compute[225701]: 2026-01-23 10:36:23.534 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:36:23 np0005593295 nova_compute[225701]: 2026-01-23 10:36:23.536 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4848MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:36:23 np0005593295 nova_compute[225701]: 2026-01-23 10:36:23.536 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:23 np0005593295 nova_compute[225701]: 2026-01-23 10:36:23.536 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:23 np0005593295 nova_compute[225701]: 2026-01-23 10:36:23.909 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:36:23 np0005593295 nova_compute[225701]: 2026-01-23 10:36:23.909 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:36:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:24.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:36:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:24.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:36:24 np0005593295 nova_compute[225701]: 2026-01-23 10:36:24.048 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:36:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:24 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:36:24 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1368383346' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:36:24 np0005593295 nova_compute[225701]: 2026-01-23 10:36:24.477 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:36:24 np0005593295 nova_compute[225701]: 2026-01-23 10:36:24.483 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:36:24 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:24 np0005593295 nova_compute[225701]: 2026-01-23 10:36:24.947 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:25 np0005593295 nova_compute[225701]: 2026-01-23 10:36:25.222 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:36:25 np0005593295 nova_compute[225701]: 2026-01-23 10:36:25.224 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:36:25 np0005593295 nova_compute[225701]: 2026-01-23 10:36:25.224 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.688s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:26.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:26.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:26 np0005593295 nova_compute[225701]: 2026-01-23 10:36:26.225 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:26 np0005593295 nova_compute[225701]: 2026-01-23 10:36:26.225 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:26 np0005593295 nova_compute[225701]: 2026-01-23 10:36:26.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:36:26 np0005593295 nova_compute[225701]: 2026-01-23 10:36:26.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 05:36:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:27 np0005593295 nova_compute[225701]: 2026-01-23 10:36:27.608 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:36:27 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:36:27 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:36:27 np0005593295 nova_compute[225701]: 2026-01-23 10:36:27.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:36:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:28.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:28.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:28 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:36:28 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:36:28 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:36:28 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:36:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:29 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:29 np0005593295 nova_compute[225701]: 2026-01-23 10:36:29.950 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:36:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:30.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:36:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:30.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:36:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:32.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:36:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:32.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:36:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:32 np0005593295 nova_compute[225701]: 2026-01-23 10:36:32.612 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:36:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:34.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:34.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:34 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:34 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:36:34 np0005593295 nova_compute[225701]: 2026-01-23 10:36:34.951 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:36:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:35 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:36:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:36.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:36:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:36.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:36:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:36 np0005593295 nova_compute[225701]: 2026-01-23 10:36:36.837 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:36:36 np0005593295 nova_compute[225701]: 2026-01-23 10:36:36.837 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 05:36:36 np0005593295 nova_compute[225701]: 2026-01-23 10:36:36.936 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 05:36:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:37 np0005593295 nova_compute[225701]: 2026-01-23 10:36:37.614 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:36:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:38.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:38.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:39 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:39 np0005593295 nova_compute[225701]: 2026-01-23 10:36:39.953 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:36:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:36:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:40.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:36:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:40.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:42.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:36:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:42.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:36:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:42 np0005593295 nova_compute[225701]: 2026-01-23 10:36:42.617 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:36:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:44.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:44.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:44 np0005593295 podman[248621]: 2026-01-23 10:36:44.622909618 +0000 UTC m=+0.051558484 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:36:44 np0005593295 podman[248620]: 2026-01-23 10:36:44.656614773 +0000 UTC m=+0.085572516 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 05:36:44 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:44 np0005593295 nova_compute[225701]: 2026-01-23 10:36:44.955 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:36:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:46.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:46.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:47 np0005593295 nova_compute[225701]: 2026-01-23 10:36:47.619 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:36:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:48.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:36:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:48.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:36:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:49 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:49 np0005593295 nova_compute[225701]: 2026-01-23 10:36:49.955 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:36:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:50.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:50.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:36:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:52.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:36:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:36:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:52.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:36:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:52 np0005593295 nova_compute[225701]: 2026-01-23 10:36:52.622 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:36:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:54.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:54.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:54 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:54 np0005593295 nova_compute[225701]: 2026-01-23 10:36:54.957 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:36:55.508 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:36:55.509 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:36:55.509 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:56.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:36:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:56.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:36:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:57 np0005593295 nova_compute[225701]: 2026-01-23 10:36:57.625 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:58.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:36:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:36:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:58.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:36:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:36:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:36:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:36:59 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:59 np0005593295 nova_compute[225701]: 2026-01-23 10:36:59.959 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:00.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:37:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:00.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:37:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:02.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:02.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:02 np0005593295 nova_compute[225701]: 2026-01-23 10:37:02.670 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:04.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:04.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:04 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:05 np0005593295 nova_compute[225701]: 2026-01-23 10:37:05.005 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:06.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:06.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:07 np0005593295 nova_compute[225701]: 2026-01-23 10:37:07.673 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:08.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:08.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:09 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:10 np0005593295 nova_compute[225701]: 2026-01-23 10:37:10.007 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:10.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:10.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:12.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:12.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:12 np0005593295 nova_compute[225701]: 2026-01-23 10:37:12.675 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:37:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:14.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:37:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:37:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:14.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:37:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:14 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:14 np0005593295 nova_compute[225701]: 2026-01-23 10:37:14.877 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:15 np0005593295 nova_compute[225701]: 2026-01-23 10:37:15.009 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:15 np0005593295 podman[248745]: 2026-01-23 10:37:15.622389727 +0000 UTC m=+0.046092589 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:37:15 np0005593295 podman[248744]: 2026-01-23 10:37:15.660556852 +0000 UTC m=+0.088573491 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 23 05:37:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:16.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:37:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:16.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:37:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:17 np0005593295 nova_compute[225701]: 2026-01-23 10:37:17.678 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:17 np0005593295 nova_compute[225701]: 2026-01-23 10:37:17.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:18.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:18.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:19 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:19 np0005593295 nova_compute[225701]: 2026-01-23 10:37:19.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:19 np0005593295 nova_compute[225701]: 2026-01-23 10:37:19.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:37:19 np0005593295 nova_compute[225701]: 2026-01-23 10:37:19.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:37:19 np0005593295 nova_compute[225701]: 2026-01-23 10:37:19.801 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:37:20 np0005593295 nova_compute[225701]: 2026-01-23 10:37:20.011 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:20.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:20.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:21 np0005593295 nova_compute[225701]: 2026-01-23 10:37:21.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:37:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:22.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:37:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:22.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:22 np0005593295 nova_compute[225701]: 2026-01-23 10:37:22.681 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:23 np0005593295 nova_compute[225701]: 2026-01-23 10:37:23.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:23 np0005593295 nova_compute[225701]: 2026-01-23 10:37:23.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:37:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:24.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:37:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:24.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:24 np0005593295 nova_compute[225701]: 2026-01-23 10:37:24.400 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:24 np0005593295 nova_compute[225701]: 2026-01-23 10:37:24.401 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:24 np0005593295 nova_compute[225701]: 2026-01-23 10:37:24.401 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:24 np0005593295 nova_compute[225701]: 2026-01-23 10:37:24.401 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:37:24 np0005593295 nova_compute[225701]: 2026-01-23 10:37:24.402 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:37:24 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:24 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:37:24 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/583159563' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:37:24 np0005593295 nova_compute[225701]: 2026-01-23 10:37:24.852 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:37:24 np0005593295 nova_compute[225701]: 2026-01-23 10:37:24.993 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:37:24 np0005593295 nova_compute[225701]: 2026-01-23 10:37:24.995 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4863MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:37:24 np0005593295 nova_compute[225701]: 2026-01-23 10:37:24.995 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:24 np0005593295 nova_compute[225701]: 2026-01-23 10:37:24.995 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:25 np0005593295 nova_compute[225701]: 2026-01-23 10:37:25.012 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:25 np0005593295 nova_compute[225701]: 2026-01-23 10:37:25.391 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:37:25 np0005593295 nova_compute[225701]: 2026-01-23 10:37:25.392 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:37:25 np0005593295 nova_compute[225701]: 2026-01-23 10:37:25.503 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:37:25 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:37:25 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/574915668' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:37:25 np0005593295 nova_compute[225701]: 2026-01-23 10:37:25.956 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:37:25 np0005593295 nova_compute[225701]: 2026-01-23 10:37:25.961 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:37:25 np0005593295 nova_compute[225701]: 2026-01-23 10:37:25.979 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:37:25 np0005593295 nova_compute[225701]: 2026-01-23 10:37:25.980 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:37:25 np0005593295 nova_compute[225701]: 2026-01-23 10:37:25.981 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.985s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:26.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:26.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:27 np0005593295 nova_compute[225701]: 2026-01-23 10:37:27.711 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:27 np0005593295 nova_compute[225701]: 2026-01-23 10:37:27.981 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:27 np0005593295 nova_compute[225701]: 2026-01-23 10:37:27.982 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:27 np0005593295 nova_compute[225701]: 2026-01-23 10:37:27.982 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:27 np0005593295 nova_compute[225701]: 2026-01-23 10:37:27.982 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:37:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:28.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:28.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:28 np0005593295 nova_compute[225701]: 2026-01-23 10:37:28.779 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:29 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:30 np0005593295 nova_compute[225701]: 2026-01-23 10:37:30.013 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:37:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:30.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:37:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:30.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:37:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:32.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:37:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:32.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:32 np0005593295 nova_compute[225701]: 2026-01-23 10:37:32.714 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:34.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:34.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:34 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:35 np0005593295 nova_compute[225701]: 2026-01-23 10:37:35.016 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:36.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:36.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:37 np0005593295 nova_compute[225701]: 2026-01-23 10:37:37.779 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:38.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:38.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:39 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:40 np0005593295 nova_compute[225701]: 2026-01-23 10:37:40.064 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:37:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:40.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:37:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:37:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:40.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:37:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:41 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:37:41 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:37:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:37:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:42.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:37:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:37:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:42.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:37:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:42 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:37:42 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:37:42 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:37:42 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:37:42 np0005593295 nova_compute[225701]: 2026-01-23 10:37:42.782 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:37:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:44.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:37:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:44.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:44 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:45 np0005593295 nova_compute[225701]: 2026-01-23 10:37:45.064 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:46.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:46.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:46 np0005593295 podman[248976]: 2026-01-23 10:37:46.654773213 +0000 UTC m=+0.071512882 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:37:46 np0005593295 podman[248975]: 2026-01-23 10:37:46.686422388 +0000 UTC m=+0.110254331 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 05:37:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:47 np0005593295 nova_compute[225701]: 2026-01-23 10:37:47.784 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:48.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:37:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:48.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:37:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:48 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Jan 23 05:37:48 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:48.971528) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:37:48 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Jan 23 05:37:48 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164668971878, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 1356, "num_deletes": 257, "total_data_size": 3394062, "memory_usage": 3463504, "flush_reason": "Manual Compaction"}
Jan 23 05:37:48 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Jan 23 05:37:48 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164668991936, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 2198714, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39492, "largest_seqno": 40843, "table_properties": {"data_size": 2192833, "index_size": 3208, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12447, "raw_average_key_size": 19, "raw_value_size": 2180973, "raw_average_value_size": 3450, "num_data_blocks": 137, "num_entries": 632, "num_filter_entries": 632, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164555, "oldest_key_time": 1769164555, "file_creation_time": 1769164668, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:37:48 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 20432 microseconds, and 9787 cpu microseconds.
Jan 23 05:37:48 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:48.992052) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 2198714 bytes OK
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:48.992100) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.033821) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.033877) EVENT_LOG_v1 {"time_micros": 1769164669033867, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.033906) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 3387697, prev total WAL file size 3387697, number of live WAL files 2.
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.035510) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303130' seq:72057594037927935, type:22 .. '6C6F676D0031323633' seq:0, type:0; will stop at (end)
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(2147KB)], [75(12MB)]
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164669035679, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 15313433, "oldest_snapshot_seqno": -1}
Jan 23 05:37:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6941 keys, 15160705 bytes, temperature: kUnknown
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164669255653, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 15160705, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15114169, "index_size": 28056, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17413, "raw_key_size": 182779, "raw_average_key_size": 26, "raw_value_size": 14988903, "raw_average_value_size": 2159, "num_data_blocks": 1100, "num_entries": 6941, "num_filter_entries": 6941, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769164669, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.256053) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 15160705 bytes
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.257422) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 69.6 rd, 68.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 12.5 +0.0 blob) out(14.5 +0.0 blob), read-write-amplify(13.9) write-amplify(6.9) OK, records in: 7471, records dropped: 530 output_compression: NoCompression
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.257440) EVENT_LOG_v1 {"time_micros": 1769164669257431, "job": 46, "event": "compaction_finished", "compaction_time_micros": 220140, "compaction_time_cpu_micros": 54695, "output_level": 6, "num_output_files": 1, "total_output_size": 15160705, "num_input_records": 7471, "num_output_records": 6941, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164669257990, "job": 46, "event": "table_file_deletion", "file_number": 77}
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164669260356, "job": 46, "event": "table_file_deletion", "file_number": 75}
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.034799) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.260411) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.260415) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.260416) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.260418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:37:49.260419) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:37:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:49 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:50 np0005593295 nova_compute[225701]: 2026-01-23 10:37:50.109 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:37:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:50.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:37:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:50.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:37:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:52.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:37:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:37:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:52.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:37:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:52 np0005593295 nova_compute[225701]: 2026-01-23 10:37:52.788 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:37:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:54.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:37:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:54.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:54 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:55 np0005593295 nova_compute[225701]: 2026-01-23 10:37:55.112 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:37:55.509 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:37:55.510 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:37:55.510 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:37:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:56.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:37:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:56.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:56 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:37:56 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:37:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:57 np0005593295 nova_compute[225701]: 2026-01-23 10:37:57.791 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:58.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:37:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:37:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:58.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:37:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:37:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:37:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:37:59 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:00 np0005593295 nova_compute[225701]: 2026-01-23 10:38:00.115 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:38:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:00.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:38:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:00.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:00 np0005593295 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:38:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:38:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:02.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:38:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:38:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:02.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:38:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:02 np0005593295 nova_compute[225701]: 2026-01-23 10:38:02.795 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:04.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:38:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:04.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:38:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:04 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:05 np0005593295 nova_compute[225701]: 2026-01-23 10:38:05.116 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:38:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:06.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:38:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:38:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:06.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:38:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:07 np0005593295 nova_compute[225701]: 2026-01-23 10:38:07.797 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:08.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:08.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:09 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:10 np0005593295 nova_compute[225701]: 2026-01-23 10:38:10.120 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:38:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:10.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:38:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:10.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:38:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:12.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:38:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:12.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:12 np0005593295 nova_compute[225701]: 2026-01-23 10:38:12.841 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:38:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:14.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:38:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:14.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:14 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:15 np0005593295 nova_compute[225701]: 2026-01-23 10:38:15.122 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:16.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:16.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:16 np0005593295 nova_compute[225701]: 2026-01-23 10:38:16.800 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:17 np0005593295 podman[249126]: 2026-01-23 10:38:17.625226995 +0000 UTC m=+0.045677880 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:38:17 np0005593295 podman[249125]: 2026-01-23 10:38:17.658591712 +0000 UTC m=+0.085772252 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:38:17 np0005593295 nova_compute[225701]: 2026-01-23 10:38:17.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:17 np0005593295 nova_compute[225701]: 2026-01-23 10:38:17.844 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:18.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:18.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:19 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:19 np0005593295 nova_compute[225701]: 2026-01-23 10:38:19.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:19 np0005593295 nova_compute[225701]: 2026-01-23 10:38:19.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:38:19 np0005593295 nova_compute[225701]: 2026-01-23 10:38:19.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:38:19 np0005593295 nova_compute[225701]: 2026-01-23 10:38:19.797 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:38:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:20 np0005593295 nova_compute[225701]: 2026-01-23 10:38:20.124 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:20.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:20.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:38:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:22.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:38:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:22.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:22 np0005593295 nova_compute[225701]: 2026-01-23 10:38:22.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:22 np0005593295 nova_compute[225701]: 2026-01-23 10:38:22.847 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:23 np0005593295 nova_compute[225701]: 2026-01-23 10:38:23.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:23 np0005593295 nova_compute[225701]: 2026-01-23 10:38:23.809 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:23 np0005593295 nova_compute[225701]: 2026-01-23 10:38:23.809 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:23 np0005593295 nova_compute[225701]: 2026-01-23 10:38:23.809 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:23 np0005593295 nova_compute[225701]: 2026-01-23 10:38:23.810 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:38:23 np0005593295 nova_compute[225701]: 2026-01-23 10:38:23.810 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:38:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:24.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:38:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:24.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:24 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:38:24 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/15073574' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:38:24 np0005593295 nova_compute[225701]: 2026-01-23 10:38:24.281 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:24 np0005593295 nova_compute[225701]: 2026-01-23 10:38:24.426 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:38:24 np0005593295 nova_compute[225701]: 2026-01-23 10:38:24.427 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4876MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:38:24 np0005593295 nova_compute[225701]: 2026-01-23 10:38:24.428 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:24 np0005593295 nova_compute[225701]: 2026-01-23 10:38:24.428 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:24 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:25 np0005593295 nova_compute[225701]: 2026-01-23 10:38:25.000 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:38:25 np0005593295 nova_compute[225701]: 2026-01-23 10:38:25.000 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:38:25 np0005593295 nova_compute[225701]: 2026-01-23 10:38:25.013 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Refreshing inventories for resource provider db762d15-510c-4120-bfc4-afe76b90b657 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:38:25 np0005593295 nova_compute[225701]: 2026-01-23 10:38:25.070 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Updating ProviderTree inventory for provider db762d15-510c-4120-bfc4-afe76b90b657 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:38:25 np0005593295 nova_compute[225701]: 2026-01-23 10:38:25.071 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Updating inventory in ProviderTree for provider db762d15-510c-4120-bfc4-afe76b90b657 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:38:25 np0005593295 nova_compute[225701]: 2026-01-23 10:38:25.082 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Refreshing aggregate associations for resource provider db762d15-510c-4120-bfc4-afe76b90b657, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:38:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:25 np0005593295 nova_compute[225701]: 2026-01-23 10:38:25.106 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Refreshing trait associations for resource provider db762d15-510c-4120-bfc4-afe76b90b657, traits: COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX2,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_F16C,HW_CPU_X86_ABM,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_BMI2,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SVM,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_FDC,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_CLMUL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:38:25 np0005593295 nova_compute[225701]: 2026-01-23 10:38:25.121 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:25 np0005593295 nova_compute[225701]: 2026-01-23 10:38:25.139 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:25 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:38:25 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3162061129' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:38:25 np0005593295 nova_compute[225701]: 2026-01-23 10:38:25.580 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:25 np0005593295 nova_compute[225701]: 2026-01-23 10:38:25.587 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:38:25 np0005593295 nova_compute[225701]: 2026-01-23 10:38:25.601 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:38:25 np0005593295 nova_compute[225701]: 2026-01-23 10:38:25.604 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:38:25 np0005593295 nova_compute[225701]: 2026-01-23 10:38:25.604 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:26.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:26.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:26 np0005593295 nova_compute[225701]: 2026-01-23 10:38:26.605 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:26 np0005593295 nova_compute[225701]: 2026-01-23 10:38:26.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:27 np0005593295 nova_compute[225701]: 2026-01-23 10:38:27.782 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:27 np0005593295 nova_compute[225701]: 2026-01-23 10:38:27.848 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:28.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:28.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:28 np0005593295 nova_compute[225701]: 2026-01-23 10:38:28.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:28 np0005593295 nova_compute[225701]: 2026-01-23 10:38:28.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:38:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:29 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:30 np0005593295 nova_compute[225701]: 2026-01-23 10:38:30.126 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:30.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:30.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:38:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:32.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:38:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:32.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:32 np0005593295 nova_compute[225701]: 2026-01-23 10:38:32.851 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:34.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:34.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:34 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:35 np0005593295 nova_compute[225701]: 2026-01-23 10:38:35.127 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:36.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:38:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:36.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:38:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:37 np0005593295 nova_compute[225701]: 2026-01-23 10:38:37.854 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:38.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:38.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:39 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:40 np0005593295 nova_compute[225701]: 2026-01-23 10:38:40.129 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:40.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:40.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:42.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:42.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:42 np0005593295 nova_compute[225701]: 2026-01-23 10:38:42.856 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:44.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:44.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:44 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:45 np0005593295 nova_compute[225701]: 2026-01-23 10:38:45.130 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:38:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:46.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:38:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:46.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:47 np0005593295 nova_compute[225701]: 2026-01-23 10:38:47.858 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:48.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:48.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:48 np0005593295 podman[249271]: 2026-01-23 10:38:48.62874471 +0000 UTC m=+0.050727203 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 05:38:48 np0005593295 podman[249270]: 2026-01-23 10:38:48.654458759 +0000 UTC m=+0.079901927 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 23 05:38:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:49 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:50 np0005593295 nova_compute[225701]: 2026-01-23 10:38:50.131 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:50.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:50.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:52.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:52.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:52 np0005593295 nova_compute[225701]: 2026-01-23 10:38:52.861 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:54.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:54.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:54 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:55 np0005593295 nova_compute[225701]: 2026-01-23 10:38:55.133 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:38:55.511 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:38:55.512 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:38:55.512 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:56.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:56.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:57 np0005593295 nova_compute[225701]: 2026-01-23 10:38:57.863 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:58.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:38:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:58.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:58 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:38:58 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:38:58 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:38:58 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:38:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:38:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:38:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:38:59 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:00 np0005593295 nova_compute[225701]: 2026-01-23 10:39:00.135 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:00.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:39:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:00.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:39:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.002000048s ======
Jan 23 05:39:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:02.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Jan 23 05:39:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:02.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:02 np0005593295 nova_compute[225701]: 2026-01-23 10:39:02.866 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:04.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:04 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:39:04 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:39:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:39:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:04.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:39:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:04 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:05 np0005593295 nova_compute[225701]: 2026-01-23 10:39:05.138 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:06.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:06.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:07 np0005593295 nova_compute[225701]: 2026-01-23 10:39:07.868 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:08.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:08.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:09 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:10 np0005593295 nova_compute[225701]: 2026-01-23 10:39:10.177 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:10.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:39:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:10.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:39:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:12.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:39:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:12.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:39:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:12 np0005593295 nova_compute[225701]: 2026-01-23 10:39:12.870 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:14.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:14.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:14 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:15 np0005593295 nova_compute[225701]: 2026-01-23 10:39:15.180 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:15 np0005593295 nova_compute[225701]: 2026-01-23 10:39:15.779 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:16.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:16.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:17 np0005593295 nova_compute[225701]: 2026-01-23 10:39:17.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:17 np0005593295 nova_compute[225701]: 2026-01-23 10:39:17.872 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:18.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:18.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:19 np0005593295 podman[249503]: 2026-01-23 10:39:19.63767649 +0000 UTC m=+0.060130154 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 05:39:19 np0005593295 podman[249502]: 2026-01-23 10:39:19.699490734 +0000 UTC m=+0.123477105 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base 
Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 05:39:19 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:20 np0005593295 nova_compute[225701]: 2026-01-23 10:39:20.229 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:20.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:20.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:21 np0005593295 nova_compute[225701]: 2026-01-23 10:39:21.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:21 np0005593295 nova_compute[225701]: 2026-01-23 10:39:21.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:39:21 np0005593295 nova_compute[225701]: 2026-01-23 10:39:21.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:39:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:22.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:39:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:22.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:39:22 np0005593295 nova_compute[225701]: 2026-01-23 10:39:22.343 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:39:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:22 np0005593295 nova_compute[225701]: 2026-01-23 10:39:22.874 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:23 np0005593295 nova_compute[225701]: 2026-01-23 10:39:23.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:23 np0005593295 nova_compute[225701]: 2026-01-23 10:39:23.818 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:23 np0005593295 nova_compute[225701]: 2026-01-23 10:39:23.819 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:23 np0005593295 nova_compute[225701]: 2026-01-23 10:39:23.819 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:23 np0005593295 nova_compute[225701]: 2026-01-23 10:39:23.819 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:39:23 np0005593295 nova_compute[225701]: 2026-01-23 10:39:23.819 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:24.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:24.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:24 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:39:24 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3838063102' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:39:24 np0005593295 nova_compute[225701]: 2026-01-23 10:39:24.347 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.528s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:24 np0005593295 nova_compute[225701]: 2026-01-23 10:39:24.497 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:39:24 np0005593295 nova_compute[225701]: 2026-01-23 10:39:24.499 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4862MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:39:24 np0005593295 nova_compute[225701]: 2026-01-23 10:39:24.499 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:24 np0005593295 nova_compute[225701]: 2026-01-23 10:39:24.499 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:24 np0005593295 nova_compute[225701]: 2026-01-23 10:39:24.577 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:39:24 np0005593295 nova_compute[225701]: 2026-01-23 10:39:24.578 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:39:24 np0005593295 nova_compute[225701]: 2026-01-23 10:39:24.600 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:24 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:25 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:39:25 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/712866539' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:39:25 np0005593295 nova_compute[225701]: 2026-01-23 10:39:25.046 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:25 np0005593295 nova_compute[225701]: 2026-01-23 10:39:25.052 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:39:25 np0005593295 nova_compute[225701]: 2026-01-23 10:39:25.067 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:39:25 np0005593295 nova_compute[225701]: 2026-01-23 10:39:25.068 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:39:25 np0005593295 nova_compute[225701]: 2026-01-23 10:39:25.068 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.569s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:25 np0005593295 nova_compute[225701]: 2026-01-23 10:39:25.230 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:26 np0005593295 nova_compute[225701]: 2026-01-23 10:39:26.069 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:26.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:26.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:26 np0005593295 nova_compute[225701]: 2026-01-23 10:39:26.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:27 np0005593295 nova_compute[225701]: 2026-01-23 10:39:27.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:27 np0005593295 nova_compute[225701]: 2026-01-23 10:39:27.785 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:27 np0005593295 nova_compute[225701]: 2026-01-23 10:39:27.876 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:39:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:28.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:39:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:28.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:28 np0005593295 nova_compute[225701]: 2026-01-23 10:39:28.779 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:30 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:30 np0005593295 nova_compute[225701]: 2026-01-23 10:39:30.232 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:30.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:30.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:30 np0005593295 nova_compute[225701]: 2026-01-23 10:39:30.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:30 np0005593295 nova_compute[225701]: 2026-01-23 10:39:30.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:39:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:32.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:39:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:32.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:39:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:32 np0005593295 nova_compute[225701]: 2026-01-23 10:39:32.879 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:34.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:34.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:35 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:35 np0005593295 nova_compute[225701]: 2026-01-23 10:39:35.235 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:36.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:36.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:37 np0005593295 nova_compute[225701]: 2026-01-23 10:39:37.881 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #79. Immutable memtables: 0.
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.029794) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 47] Flushing memtable with next log file: 79
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164778029856, "job": 47, "event": "flush_started", "num_memtables": 1, "num_entries": 1276, "num_deletes": 251, "total_data_size": 3247013, "memory_usage": 3285208, "flush_reason": "Manual Compaction"}
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 47] Level-0 flush table #80: started
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164778070254, "cf_name": "default", "job": 47, "event": "table_file_creation", "file_number": 80, "file_size": 2102623, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 40848, "largest_seqno": 42119, "table_properties": {"data_size": 2096978, "index_size": 3039, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11955, "raw_average_key_size": 19, "raw_value_size": 2085740, "raw_average_value_size": 3482, "num_data_blocks": 131, "num_entries": 599, "num_filter_entries": 599, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164669, "oldest_key_time": 1769164669, "file_creation_time": 1769164778, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 80, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 47] Flush lasted 40516 microseconds, and 6160 cpu microseconds.
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.070314) [db/flush_job.cc:967] [default] [JOB 47] Level-0 flush table #80: 2102623 bytes OK
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.070334) [db/memtable_list.cc:519] [default] Level-0 commit table #80 started
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.073915) [db/memtable_list.cc:722] [default] Level-0 commit table #80: memtable #1 done
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.073967) EVENT_LOG_v1 {"time_micros": 1769164778073957, "job": 47, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.073992) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 47] Try to delete WAL files size 3240998, prev total WAL file size 3240998, number of live WAL files 2.
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000076.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.075091) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033323633' seq:72057594037927935, type:22 .. '7061786F730033353135' seq:0, type:0; will stop at (end)
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 48] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 47 Base level 0, inputs: [80(2053KB)], [78(14MB)]
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164778075228, "job": 48, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [80], "files_L6": [78], "score": -1, "input_data_size": 17263328, "oldest_snapshot_seqno": -1}
Jan 23 05:39:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 48] Generated table #81: 7024 keys, 14949350 bytes, temperature: kUnknown
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164778202032, "cf_name": "default", "job": 48, "event": "table_file_creation", "file_number": 81, "file_size": 14949350, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14902767, "index_size": 27911, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17605, "raw_key_size": 185222, "raw_average_key_size": 26, "raw_value_size": 14776455, "raw_average_value_size": 2103, "num_data_blocks": 1086, "num_entries": 7024, "num_filter_entries": 7024, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769164778, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 81, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.202277) [db/compaction/compaction_job.cc:1663] [default] [JOB 48] Compacted 1@0 + 1@6 files to L6 => 14949350 bytes
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.207948) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 136.1 rd, 117.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 14.5 +0.0 blob) out(14.3 +0.0 blob), read-write-amplify(15.3) write-amplify(7.1) OK, records in: 7540, records dropped: 516 output_compression: NoCompression
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.207976) EVENT_LOG_v1 {"time_micros": 1769164778207961, "job": 48, "event": "compaction_finished", "compaction_time_micros": 126872, "compaction_time_cpu_micros": 34669, "output_level": 6, "num_output_files": 1, "total_output_size": 14949350, "num_input_records": 7540, "num_output_records": 7024, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000080.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164778208446, "job": 48, "event": "table_file_deletion", "file_number": 80}
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000078.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164778211129, "job": 48, "event": "table_file_deletion", "file_number": 78}
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.075011) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.211244) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.211259) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.211266) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.211273) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:38 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:39:38.211279) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:38.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:38.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:39:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:40.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:39:40 np0005593295 nova_compute[225701]: 2026-01-23 10:39:40.268 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:40.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:40 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:42.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:42.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:42 np0005593295 nova_compute[225701]: 2026-01-23 10:39:42.884 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:44.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:44.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:45 np0005593295 nova_compute[225701]: 2026-01-23 10:39:45.271 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:45 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:46.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.002000047s ======
Jan 23 05:39:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:46.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000047s
Jan 23 05:39:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:47 np0005593295 nova_compute[225701]: 2026-01-23 10:39:47.886 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:48.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:48.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:50 np0005593295 nova_compute[225701]: 2026-01-23 10:39:50.272 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:50.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:50.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:50 np0005593295 podman[249650]: 2026-01-23 10:39:50.631748514 +0000 UTC m=+0.056391172 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 05:39:50 np0005593295 podman[249649]: 2026-01-23 10:39:50.655494825 +0000 UTC m=+0.080734767 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Jan 23 05:39:50 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:39:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:52.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:39:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:39:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:52.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:39:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:52 np0005593295 nova_compute[225701]: 2026-01-23 10:39:52.888 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:54.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:54.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:55 np0005593295 nova_compute[225701]: 2026-01-23 10:39:55.273 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:39:55.511 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:39:55.512 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:39:55.512 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:55 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:56.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:56.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:57 np0005593295 nova_compute[225701]: 2026-01-23 10:39:57.890 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:39:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:58.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:39:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:39:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:58.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:39:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:39:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:39:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:00 np0005593295 nova_compute[225701]: 2026-01-23 10:40:00.275 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:40:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:00.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:40:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:40:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:00.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:40:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:00 np0005593295 ceph-mon[75771]: overall HEALTH_WARN 2 OSD(s) experiencing slow operations in BlueStore; 2 failed cephadm daemon(s)
Jan 23 05:40:00 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:02.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:40:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:02.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:40:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:02 np0005593295 nova_compute[225701]: 2026-01-23 10:40:02.893 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:40:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:04.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:40:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:04.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:05 np0005593295 nova_compute[225701]: 2026-01-23 10:40:05.277 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:05 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:06 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:40:06 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:40:06 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:40:06 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:40:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:06.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:40:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:06.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:40:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:07 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:40:07 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:40:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:07 np0005593295 nova_compute[225701]: 2026-01-23 10:40:07.895 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:08.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:08.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:10 np0005593295 nova_compute[225701]: 2026-01-23 10:40:10.279 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:40:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:10.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:40:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:10.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:10 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:11 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:40:11 np0005593295 ceph-mon[75771]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 05:40:11 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:11 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:12.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:12 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:12 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:40:12 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:12.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:40:12 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:12 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:12 np0005593295 nova_compute[225701]: 2026-01-23 10:40:12.898 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:13 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:13 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:14.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:14 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:14 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:14 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:14.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:14 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:14 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:15 np0005593295 nova_compute[225701]: 2026-01-23 10:40:15.281 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:15 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:15 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:15 np0005593295 nova_compute[225701]: 2026-01-23 10:40:15.778 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:15 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:16.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:16 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:16 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:16 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:16.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:16 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:16 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:17 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:17 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:17 np0005593295 nova_compute[225701]: 2026-01-23 10:40:17.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:17 np0005593295 nova_compute[225701]: 2026-01-23 10:40:17.901 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:18.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:18 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:18 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:18 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:18 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:40:18 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:18.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:40:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:19 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:19 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:20 np0005593295 nova_compute[225701]: 2026-01-23 10:40:20.283 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:20.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:20 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:20 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:20 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:20 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:20 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:20.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:21 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:21 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:21 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:21 np0005593295 podman[249878]: 2026-01-23 10:40:21.633500726 +0000 UTC m=+0.054216359 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 05:40:21 np0005593295 podman[249877]: 2026-01-23 10:40:21.71755971 +0000 UTC m=+0.139012601 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:40:21 np0005593295 nova_compute[225701]: 2026-01-23 10:40:21.784 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:21 np0005593295 nova_compute[225701]: 2026-01-23 10:40:21.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:40:21 np0005593295 nova_compute[225701]: 2026-01-23 10:40:21.784 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:40:21 np0005593295 nova_compute[225701]: 2026-01-23 10:40:21.799 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:40:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:22.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:22 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:22 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:22 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:22 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:22 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:22.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:22 np0005593295 nova_compute[225701]: 2026-01-23 10:40:22.904 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:23 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:23 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:24.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:24 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:24 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:24 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:24 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:24 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:24.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #82. Immutable memtables: 0.
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.162027) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:856] [default] [JOB 49] Flushing memtable with next log file: 82
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825162289, "job": 49, "event": "flush_started", "num_memtables": 1, "num_entries": 709, "num_deletes": 251, "total_data_size": 1512212, "memory_usage": 1528376, "flush_reason": "Manual Compaction"}
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:885] [default] [JOB 49] Level-0 flush table #83: started
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825171485, "cf_name": "default", "job": 49, "event": "table_file_creation", "file_number": 83, "file_size": 689452, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 42124, "largest_seqno": 42828, "table_properties": {"data_size": 686359, "index_size": 1001, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 8261, "raw_average_key_size": 20, "raw_value_size": 679885, "raw_average_value_size": 1712, "num_data_blocks": 43, "num_entries": 397, "num_filter_entries": 397, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164778, "oldest_key_time": 1769164778, "file_creation_time": 1769164825, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 83, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 49] Flush lasted 9446 microseconds, and 3964 cpu microseconds.
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.171548) [db/flush_job.cc:967] [default] [JOB 49] Level-0 flush table #83: 689452 bytes OK
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.171579) [db/memtable_list.cc:519] [default] Level-0 commit table #83 started
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.174743) [db/memtable_list.cc:722] [default] Level-0 commit table #83: memtable #1 done
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.174763) EVENT_LOG_v1 {"time_micros": 1769164825174759, "job": 49, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.174778) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 49] Try to delete WAL files size 1508402, prev total WAL file size 1508402, number of live WAL files 2.
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000079.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.175543) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031323535' seq:72057594037927935, type:22 .. '6D6772737461740031353037' seq:0, type:0; will stop at (end)
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 50] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 49 Base level 0, inputs: [83(673KB)], [81(14MB)]
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825175713, "job": 50, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [83], "files_L6": [81], "score": -1, "input_data_size": 15638802, "oldest_snapshot_seqno": -1}
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 50] Generated table #84: 6922 keys, 11743274 bytes, temperature: kUnknown
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825257855, "cf_name": "default", "job": 50, "event": "table_file_creation", "file_number": 84, "file_size": 11743274, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11701864, "index_size": 22994, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17349, "raw_key_size": 183305, "raw_average_key_size": 26, "raw_value_size": 11581679, "raw_average_value_size": 1673, "num_data_blocks": 892, "num_entries": 6922, "num_filter_entries": 6922, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 0, "file_creation_time": 1769164825, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "dbf9ba81-81fe-4d1e-9307-233133587890", "db_session_id": "17IZ7DW7X4LNV3P33NJD", "orig_file_number": 84, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.258330) [db/compaction/compaction_job.cc:1663] [default] [JOB 50] Compacted 1@0 + 1@6 files to L6 => 11743274 bytes
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.260057) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 190.0 rd, 142.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 14.3 +0.0 blob) out(11.2 +0.0 blob), read-write-amplify(39.7) write-amplify(17.0) OK, records in: 7421, records dropped: 499 output_compression: NoCompression
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.260078) EVENT_LOG_v1 {"time_micros": 1769164825260069, "job": 50, "event": "compaction_finished", "compaction_time_micros": 82296, "compaction_time_cpu_micros": 36610, "output_level": 6, "num_output_files": 1, "total_output_size": 11743274, "num_input_records": 7421, "num_output_records": 6922, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000083.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825260627, "job": 50, "event": "table_file_deletion", "file_number": 83}
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000081.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825263747, "job": 50, "event": "table_file_deletion", "file_number": 81}
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.175365) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.263818) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.263822) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.263824) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.263826) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:25 np0005593295 ceph-mon[75771]: rocksdb: (Original Log Time 2026/01/23-10:40:25.263827) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:25 np0005593295 nova_compute[225701]: 2026-01-23 10:40:25.285 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:25 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:25 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:25 np0005593295 nova_compute[225701]: 2026-01-23 10:40:25.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:25 np0005593295 nova_compute[225701]: 2026-01-23 10:40:25.806 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:25 np0005593295 nova_compute[225701]: 2026-01-23 10:40:25.806 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:25 np0005593295 nova_compute[225701]: 2026-01-23 10:40:25.806 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:25 np0005593295 nova_compute[225701]: 2026-01-23 10:40:25.806 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:40:25 np0005593295 nova_compute[225701]: 2026-01-23 10:40:25.807 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:26 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:26 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:40:26 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1373761955' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:40:26 np0005593295 nova_compute[225701]: 2026-01-23 10:40:26.277 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:26.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:26 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:26 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:26 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:26 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:26 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:26.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:26 np0005593295 nova_compute[225701]: 2026-01-23 10:40:26.419 225706 WARNING nova.virt.libvirt.driver [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:40:26 np0005593295 nova_compute[225701]: 2026-01-23 10:40:26.420 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4834MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:40:26 np0005593295 nova_compute[225701]: 2026-01-23 10:40:26.421 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:26 np0005593295 nova_compute[225701]: 2026-01-23 10:40:26.421 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:26 np0005593295 nova_compute[225701]: 2026-01-23 10:40:26.486 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:40:26 np0005593295 nova_compute[225701]: 2026-01-23 10:40:26.487 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:40:26 np0005593295 nova_compute[225701]: 2026-01-23 10:40:26.503 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:27 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 05:40:27 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1342687352' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:40:27 np0005593295 nova_compute[225701]: 2026-01-23 10:40:27.182 225706 DEBUG oslo_concurrency.processutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.679s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:27 np0005593295 nova_compute[225701]: 2026-01-23 10:40:27.187 225706 DEBUG nova.compute.provider_tree [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed in ProviderTree for provider: db762d15-510c-4120-bfc4-afe76b90b657 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:40:27 np0005593295 nova_compute[225701]: 2026-01-23 10:40:27.205 225706 DEBUG nova.scheduler.client.report [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Inventory has not changed for provider db762d15-510c-4120-bfc4-afe76b90b657 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:40:27 np0005593295 nova_compute[225701]: 2026-01-23 10:40:27.206 225706 DEBUG nova.compute.resource_tracker [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:40:27 np0005593295 nova_compute[225701]: 2026-01-23 10:40:27.206 225706 DEBUG oslo_concurrency.lockutils [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.785s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:27 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:27 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:27 np0005593295 nova_compute[225701]: 2026-01-23 10:40:27.907 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:28 np0005593295 nova_compute[225701]: 2026-01-23 10:40:28.207 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:28 np0005593295 nova_compute[225701]: 2026-01-23 10:40:28.208 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:28 np0005593295 nova_compute[225701]: 2026-01-23 10:40:28.208 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:28 np0005593295 nova_compute[225701]: 2026-01-23 10:40:28.208 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:28.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:28 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:28 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:28 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:28 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:40:28 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:28.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:40:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:29 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:29 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:30 np0005593295 nova_compute[225701]: 2026-01-23 10:40:30.287 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:30.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:30 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:30 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:30 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:30 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:30 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:30.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:31 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:31 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:31 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:31 np0005593295 nova_compute[225701]: 2026-01-23 10:40:31.783 225706 DEBUG oslo_service.periodic_task [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:31 np0005593295 nova_compute[225701]: 2026-01-23 10:40:31.783 225706 DEBUG nova.compute.manager [None req-1f3e60bb-a440-493f-8881-50c40bb67c28 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:40:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:32.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:32 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:32 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:32 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:32 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:40:32 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:32.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:40:32 np0005593295 nova_compute[225701]: 2026-01-23 10:40:32.909 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:33 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:33 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:34.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:34 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:34 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:34 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:34 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:34 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:34.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:35 np0005593295 nova_compute[225701]: 2026-01-23 10:40:35.289 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:35 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:35 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:36 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:40:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:36.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:40:36 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:36 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:36 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:36 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:40:36 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:36.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:40:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:37 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:37 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:37 np0005593295 nova_compute[225701]: 2026-01-23 10:40:37.912 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:38.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:38 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:38 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:38 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:38 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:40:38 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:38.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:40:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:39 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:39 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:40 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:40:40 np0005593295 ceph-mon[75771]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.0 total, 600.0 interval#012Cumulative writes: 8117 writes, 42K keys, 8117 commit groups, 1.0 writes per commit group, ingest: 0.10 GB, 0.03 MB/s#012Cumulative WAL: 8117 writes, 8117 syncs, 1.00 writes per sync, written: 0.10 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1522 writes, 8008 keys, 1522 commit groups, 1.0 writes per commit group, ingest: 17.39 MB, 0.03 MB/s#012Interval WAL: 1523 writes, 1523 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     26.7      2.26              0.24        25    0.091       0      0       0.0       0.0#012  L6      1/0   11.20 MB   0.0      0.3     0.1      0.3       0.3      0.0       0.0   5.0     93.8     80.6      3.75              1.13        24    0.156    146K    13K       0.0       0.0#012 Sum      1/0   11.20 MB   0.0      0.3     0.1      0.3       0.4      0.1       0.0   6.0     58.5     60.4      6.02              1.37        49    0.123    146K    13K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   8.7    108.3    105.5      0.96              0.29        14    0.069     51K   4033       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.3     0.1      0.3       0.3      0.0       0.0   0.0     93.8     80.6      3.75              1.13        24    0.156    146K    13K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     26.8      2.26              0.24        24    0.094       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 3000.0 total, 600.0 interval#012Flush(GB): cumulative 0.059, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.35 GB write, 0.12 MB/s write, 0.34 GB read, 0.12 MB/s read, 6.0 seconds#012Interval compaction: 0.10 GB write, 0.17 MB/s write, 0.10 GB read, 0.17 MB/s read, 1.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55c6513709b0#2 capacity: 304.00 MB usage: 31.80 MB table_size: 0 occupancy: 18446744073709551615 collections: 6 last_copies: 0 last_secs: 0.000212 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1897,30.72 MB,10.1039%) FilterBlock(49,427.73 KB,0.137404%) IndexBlock(49,682.17 KB,0.219139%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 05:40:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:40 np0005593295 nova_compute[225701]: 2026-01-23 10:40:40.292 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:40:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:40.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:40:40 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:40 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:40 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:40 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:40 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:40.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:41 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:41 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:41 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:42.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:42 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:42 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:42 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:42 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:40:42 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:42.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:40:42 np0005593295 nova_compute[225701]: 2026-01-23 10:40:42.915 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:43 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:43 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:43 np0005593295 systemd-logind[786]: New session 58 of user zuul.
Jan 23 05:40:43 np0005593295 systemd[1]: Started Session 58 of User zuul.
Jan 23 05:40:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:44.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:44 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:44 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:44 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:44 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:44 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:44.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:45 np0005593295 nova_compute[225701]: 2026-01-23 10:40:45.295 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:45 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:45 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:46 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:46.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:46 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:46 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:46 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:46 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:46 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:46.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:47 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Jan 23 05:40:47 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2902829741' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 05:40:47 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:47 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:47 np0005593295 nova_compute[225701]: 2026-01-23 10:40:47.918 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:48.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:48 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:48 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:48 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:48 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:48 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:48.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:49 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:49 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:50 np0005593295 nova_compute[225701]: 2026-01-23 10:40:50.295 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:50.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:50 np0005593295 ovs-vsctl[250348]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 23 05:40:50 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:50 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:50 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:50 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:40:50 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:50.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:40:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:51 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:51 np0005593295 virtqemud[225221]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 23 05:40:51 np0005593295 virtqemud[225221]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 23 05:40:51 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:51 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:51 np0005593295 virtqemud[225221]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 23 05:40:51 np0005593295 podman[250604]: 2026-01-23 10:40:51.781169538 +0000 UTC m=+0.060471851 container health_status bf11b28fa2f2f1ab2ed32cfa0ed3939bc832abd0643b236a5cd65dd051ea17d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:40:51 np0005593295 podman[250620]: 2026-01-23 10:40:51.83064516 +0000 UTC m=+0.080371924 container health_status 7389fb5f1793c621771ce3311225561fce3d7bf9f3fb7bc6e9067c832b5a9087 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61-6a3b026d9f30a263144a7e55eb3177db18f084312517b5cf670c5228aa4faa61'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:40:51 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: cache status {prefix=cache status} (starting...)
Jan 23 05:40:51 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: client ls {prefix=client ls} (starting...)
Jan 23 05:40:52 np0005593295 lvm[250768]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 05:40:52 np0005593295 lvm[250768]: VG ceph_vg0 finished
Jan 23 05:40:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:40:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:52.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:40:52 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:52 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:52 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:52 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:52 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:52.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:52 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: damage ls {prefix=damage ls} (starting...)
Jan 23 05:40:52 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump loads {prefix=dump loads} (starting...)
Jan 23 05:40:52 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 23 05:40:52 np0005593295 nova_compute[225701]: 2026-01-23 10:40:52.920 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:52 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Jan 23 05:40:52 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2735878044' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 05:40:53 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 23 05:40:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:53 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 23 05:40:53 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 23 05:40:53 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Jan 23 05:40:53 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2326781007' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 05:40:53 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:53 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:53 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 23 05:40:53 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 23 05:40:53 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Jan 23 05:40:53 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4283374481' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 05:40:54 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: ops {prefix=ops} (starting...)
Jan 23 05:40:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:54 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Jan 23 05:40:54 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1170917103' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 05:40:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:54.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:54 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:54 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:54 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:54 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:54 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:54.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:54 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Jan 23 05:40:54 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1278040154' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 05:40:54 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 23 05:40:54 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1520575926' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 05:40:54 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: session ls {prefix=session ls} (starting...)
Jan 23 05:40:54 np0005593295 ceph-mds[83039]: mds.cephfs.compute-2.prgzmm asok_command: status {prefix=status} (starting...)
Jan 23 05:40:55 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 23 05:40:55 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/431388215' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 05:40:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:55 np0005593295 nova_compute[225701]: 2026-01-23 10:40:55.321 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:55 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:55 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:40:55.513 142606 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:40:55.513 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:55 np0005593295 ovn_metadata_agent[142601]: 2026-01-23 10:40:55.513 142606 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:55 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 23 05:40:55 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2891389420' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 05:40:55 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Jan 23 05:40:55 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2152170927' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 05:40:56 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 23 05:40:56 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/858187861' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 05:40:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:56 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:56 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Jan 23 05:40:56 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4002027261' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 05:40:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:56.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:56 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:56 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:56 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:56 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:56 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:56.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:56 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Jan 23 05:40:56 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/676404484' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 05:40:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:57 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 23 05:40:57 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3056118106' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 05:40:57 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Jan 23 05:40:57 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/529095491' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 05:40:57 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:57 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:57 np0005593295 nova_compute[225701]: 2026-01-23 10:40:57.922 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:57 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Jan 23 05:40:57 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3096053969' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836100 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 63.238079071s of 63.242374420s, submitted: 1
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73383936 unmapped: 1024000 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73383936 unmapped: 1024000 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837612 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x5592261ec800 session 0x559226fee960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837021 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 110.972518921s of 111.781974792s, submitted: 2
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.1 total, 600.0 interval
Cumulative writes: 5807 writes, 24K keys, 5807 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 5807 writes, 987 syncs, 5.88 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 440 writes, 717 keys, 440 commit groups, 1.0 writes per commit group, ingest: 0.23 MB, 0.00 MB/s
Interval WAL: 440 writes, 204 syncs, 2.16 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x559222cb1350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.3e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 991232 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x5592261f1400 session 0x559226fefc20
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 93.923355103s of 93.927070618s, submitted: 1
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74350592 unmapped: 1105920 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1122304 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1122304 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836430 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1122304 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1122304 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 1097728 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1089536 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 74383360 unmapped: 1073152 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838014 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75448320 unmapped: 1056768 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 1048576 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 1048576 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.652290344s of 11.764292717s, submitted: 230
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840966 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75456512 unmapped: 2097152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75464704 unmapped: 2088960 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75464704 unmapped: 2088960 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75464704 unmapped: 2088960 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75472896 unmapped: 2080768 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 ms_handle_reset con 0x559226afa000 session 0x5592254723c0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75481088 unmapped: 2072576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 2064384 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 2064384 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 2064384 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840375 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 41.259738922s of 41.266269684s, submitted: 3
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 841887 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 841887 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 2056192 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 2039808 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 2023424 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 2007040 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1990656 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1982464 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1982464 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1982464 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75579392 unmapped: 1974272 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840705 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 heartbeat osd_stat(store_statfs(0x4fca7f000/0x0/0x4ffc00000, data 0xeb6fe/0x19d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1957888 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 handle_osd_map epochs [138,139], i have 138, src has [1,139]
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 130.189498901s of 130.569747925s, submitted: 3
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 138 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 75636736 unmapped: 1916928 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846191 data_alloc: 218103808 data_used: 40960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 139 handle_osd_map epochs [139,140], i have 139, src has [1,140]
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 140 handle_osd_map epochs [140,140], i have 140, src has [1,140]
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 140 heartbeat osd_stat(store_statfs(0x4fca7a000/0x0/0x4ffc00000, data 0xed7f2/0x1a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 76718080 unmapped: 835584 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 16392192 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 140 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 141 ms_handle_reset con 0x559226af8800 session 0x559227226780
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fba75000/0x0/0x4ffc00000, data 0x10ef970/0x11a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 77979648 unmapped: 16359424 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fba75000/0x0/0x4ffc00000, data 0x10ef970/0x11a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78102528 unmapped: 16236544 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 141 handle_osd_map epochs [141,142], i have 141, src has [1,142]
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 142 ms_handle_reset con 0x559224eeb000 session 0x559227226d20
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78127104 unmapped: 16211968 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967505 data_alloc: 218103808 data_used: 45056
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78127104 unmapped: 16211968 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78127104 unmapped: 16211968 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 142 handle_osd_map epochs [142,143], i have 142, src has [1,143]
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6d000/0x0/0x4ffc00000, data 0x10f3bc6/0x11ae000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969935 data_alloc: 218103808 data_used: 45056
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969935 data_alloc: 218103808 data_used: 45056
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226b04c00 session 0x559226feeb40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969935 data_alloc: 218103808 data_used: 45056
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969935 data_alloc: 218103808 data_used: 45056
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226636c00 session 0x55922721e780
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969935 data_alloc: 218103808 data_used: 45056
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 33.924980164s of 34.470951080s, submitted: 51
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971447 data_alloc: 218103808 data_used: 45056
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226728800 session 0x55922677e960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 16187392 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fba6a000/0x0/0x4ffc00000, data 0x10f5b98/0x11b1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971447 data_alloc: 218103808 data_used: 45056
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226af8400 session 0x559227227e00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x5592261e8400 session 0x55922723c5a0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226634000 session 0x55922723c780
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78168064 unmapped: 16171008 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 78168064 unmapped: 16171008 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226afcc00 session 0x55922723c960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226afcc00 session 0x55922723cd20
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 92913664 unmapped: 1425408 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 143 ms_handle_reset con 0x559226af6400 session 0x55922723cf00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 92938240 unmapped: 1400832 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.170117378s of 10.184672356s, submitted: 3
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 145 heartbeat osd_stat(store_statfs(0x4fba67000/0x0/0x4ffc00000, data 0x10f7c84/0x11b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,7])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 145 ms_handle_reset con 0x5592261e1000 session 0x55922723d0e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93839360 unmapped: 17342464 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123797 data_alloc: 234881024 data_used: 13676544
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 145 ms_handle_reset con 0x5592261e1800 session 0x55922723da40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93863936 unmapped: 17317888 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93855744 unmapped: 17326080 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 145 ms_handle_reset con 0x5592261e7c00 session 0x559226f24f00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93937664 unmapped: 17244160 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93937664 unmapped: 17244160 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 145 ms_handle_reset con 0x5592261e1000 session 0x559226f24960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 145 heartbeat osd_stat(store_statfs(0x4fac33000/0x0/0x4ffc00000, data 0x1f29dc4/0x1fe7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 145 ms_handle_reset con 0x5592261e1800 session 0x55922657f860
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93700096 unmapped: 17481728 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1128704 data_alloc: 234881024 data_used: 13676544
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 93724672 unmapped: 17457152 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 99999744 unmapped: 11182080 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105676800 unmapped: 5505024 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105676800 unmapped: 5505024 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fac0c000/0x0/0x4ffc00000, data 0x1f4fda5/0x200f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.482107162s of 10.092863083s, submitted: 62
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1227418 data_alloc: 234881024 data_used: 25862144
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fac0c000/0x0/0x4ffc00000, data 0x1f4fda5/0x200f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fac0c000/0x0/0x4ffc00000, data 0x1f4fda5/0x200f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225987 data_alloc: 234881024 data_used: 25862144
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 105709568 unmapped: 5472256 heap: 111181824 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109395968 unmapped: 3883008 heap: 113278976 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa78e000/0x0/0x4ffc00000, data 0x23ceda5/0x248e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108773376 unmapped: 9756672 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.997505188s of 10.222607613s, submitted: 78
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9eb8000/0x0/0x4ffc00000, data 0x2ca4da5/0x2d64000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111665152 unmapped: 6864896 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1356813 data_alloc: 251658240 data_used: 27123712
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111968256 unmapped: 6561792 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c6c000/0x0/0x4ffc00000, data 0x2d4fda5/0x2e0f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112009216 unmapped: 6520832 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112009216 unmapped: 6520832 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112009216 unmapped: 6520832 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1357269 data_alloc: 251658240 data_used: 27136000
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112156672 unmapped: 6373376 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c49000/0x0/0x4ffc00000, data 0x2d73da5/0x2e33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112156672 unmapped: 6373376 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112173056 unmapped: 6356992 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112173056 unmapped: 6356992 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c49000/0x0/0x4ffc00000, data 0x2d73da5/0x2e33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112181248 unmapped: 6348800 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1354717 data_alloc: 251658240 data_used: 27205632
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112181248 unmapped: 6348800 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112181248 unmapped: 6348800 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112181248 unmapped: 6348800 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.461258888s of 13.723365784s, submitted: 31
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112353280 unmapped: 6176768 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c40000/0x0/0x4ffc00000, data 0x2d7cda5/0x2e3c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1354877 data_alloc: 251658240 data_used: 27205632
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c3f000/0x0/0x4ffc00000, data 0x2d7dda5/0x2e3d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 6553600 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1354877 data_alloc: 251658240 data_used: 27205632
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c3f000/0x0/0x4ffc00000, data 0x2d7dda5/0x2e3d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ec800 session 0x55922723d4a0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af7c00 session 0x55922723cb40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e6000 session 0x55922723dc20
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 111984640 unmapped: 6545408 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1354877 data_alloc: 251658240 data_used: 27205632
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e1000 session 0x5592254712c0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114376704 unmapped: 4153344 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e1800 session 0x559225471a40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c3f000/0x0/0x4ffc00000, data 0x2d7dda5/0x2e3d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114376704 unmapped: 4153344 heap: 118530048 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ec800 session 0x559225624f00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.490158081s of 14.116064072s, submitted: 3
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af7c00 session 0x559223b87e00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115531776 unmapped: 13500416 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115531776 unmapped: 13500416 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b6000/0x0/0x4ffc00000, data 0x3506da5/0x35c6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115564544 unmapped: 13467648 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1411121 data_alloc: 251658240 data_used: 29302784
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115564544 unmapped: 13467648 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115564544 unmapped: 13467648 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b4000/0x0/0x4ffc00000, data 0x3507da5/0x35c7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 13434880 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 13434880 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 13434880 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1411257 data_alloc: 251658240 data_used: 29302784
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e2000 session 0x55922546f0e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b4000/0x0/0x4ffc00000, data 0x3507da5/0x35c7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1411409 data_alloc: 251658240 data_used: 29306880
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afa400 session 0x5592247d8960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b4000/0x0/0x4ffc00000, data 0x3507da5/0x35c7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f2400 session 0x55922721fa40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.360092163s of 15.706788063s, submitted: 14
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f3400 session 0x55922721fc20
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 13426688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115613696 unmapped: 13418496 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120020992 unmapped: 9011200 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1456479 data_alloc: 251658240 data_used: 33521664
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 8978432 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b3000/0x0/0x4ffc00000, data 0x3507db5/0x35c8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120086528 unmapped: 8945664 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120086528 unmapped: 8945664 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b3000/0x0/0x4ffc00000, data 0x3507db5/0x35c8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1456615 data_alloc: 251658240 data_used: 33521664
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b1000/0x0/0x4ffc00000, data 0x350adb5/0x35cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120119296 unmapped: 8912896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f84b1000/0x0/0x4ffc00000, data 0x350adb5/0x35cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 9330688 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.960206985s of 12.248162270s, submitted: 5
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1457327 data_alloc: 251658240 data_used: 33529856
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119889920 unmapped: 9142272 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120283136 unmapped: 8749056 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 7888896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 7888896 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f80f2000/0x0/0x4ffc00000, data 0x38bbdb5/0x397c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 7856128 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1500921 data_alloc: 251658240 data_used: 33931264
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 7856128 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 7856128 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120578048 unmapped: 8454144 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120578048 unmapped: 8454144 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f80f8000/0x0/0x4ffc00000, data 0x38c3db5/0x3984000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f80f8000/0x0/0x4ffc00000, data 0x38c3db5/0x3984000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120578048 unmapped: 8454144 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549ec00 session 0x55922669fe00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afd400 session 0x55922721e960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f80f8000/0x0/0x4ffc00000, data 0x38c3db5/0x3984000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1495881 data_alloc: 251658240 data_used: 33931264
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.955549240s of 10.352662086s, submitted: 63
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f2400 session 0x55922723c960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c2f000/0x0/0x4ffc00000, data 0x2d8dda5/0x2e4d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1356198 data_alloc: 234881024 data_used: 25632768
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c2f000/0x0/0x4ffc00000, data 0x2d8dda5/0x2e4d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 13090816 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1356198 data_alloc: 234881024 data_used: 25632768
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af6400 session 0x5592267a63c0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.853686333s of 10.125116348s, submitted: 14
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afcc00 session 0x5592272292c0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108109824 unmapped: 20922368 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f2800 session 0x55922723d4a0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08000 session 0x559227227a40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051233 data_alloc: 234881024 data_used: 15777792
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051233 data_alloc: 234881024 data_used: 15777792
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051233 data_alloc: 234881024 data_used: 15777792
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109240320 unmapped: 19791872 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.126308441s of 17.207635880s, submitted: 34
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1054257 data_alloc: 234881024 data_used: 15777792
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109289472 unmapped: 19742720 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109297664 unmapped: 19734528 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e9800 session 0x559226feed20
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108732416 unmapped: 20299776 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1053666 data_alloc: 234881024 data_used: 14729216
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676f400 session 0x559226784d20
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108339200 unmapped: 20692992 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c1000/0x0/0x4ffc00000, data 0x10fbdbf/0x11bb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108339200 unmapped: 20692992 heap: 129032192 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afb000 session 0x55922721eb40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b00000 session 0x55922669f0e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108707840 unmapped: 24526848 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afbc00 session 0x5592254eeb40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108707840 unmapped: 24526848 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108707840 unmapped: 24526848 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e9800 session 0x55922721fa40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1132656 data_alloc: 234881024 data_used: 14729216
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676f400 session 0x55922721fc20
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.649352074s of 13.054588318s, submitted: 40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 108716032 unmapped: 24518656 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9f02000/0x0/0x4ffc00000, data 0x1abadf8/0x1b7a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afb000 session 0x55922721e960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109027328 unmapped: 24207360 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 109027328 unmapped: 24207360 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9ede000/0x0/0x4ffc00000, data 0x1adedf8/0x1b9e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194396 data_alloc: 234881024 data_used: 20054016
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9ede000/0x0/0x4ffc00000, data 0x1adedf8/0x1b9e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194396 data_alloc: 234881024 data_used: 20054016
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 22962176 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9ede000/0x0/0x4ffc00000, data 0x1adedf8/0x1b9e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110280704 unmapped: 22953984 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 110280704 unmapped: 22953984 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.911570549s of 12.916566849s, submitted: 2
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261eb400 session 0x559224554960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114982912 unmapped: 18251776 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115589120 unmapped: 17645568 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1292618 data_alloc: 234881024 data_used: 21278720
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b6000/0x0/0x4ffc00000, data 0x24fddf8/0x25bd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 17416192 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113991680 unmapped: 19243008 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113991680 unmapped: 19243008 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301298 data_alloc: 234881024 data_used: 21491712
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301618 data_alloc: 234881024 data_used: 21499904
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.075481415s of 14.293769836s, submitted: 87
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301498 data_alloc: 234881024 data_used: 21504000
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113999872 unmapped: 19234816 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113147904 unmapped: 20086784 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113147904 unmapped: 20086784 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1300148 data_alloc: 234881024 data_used: 21504000
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113147904 unmapped: 20086784 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301820 data_alloc: 234881024 data_used: 21557248
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.004294395s of 14.015699387s, submitted: 3
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b00000 session 0x55922721ef00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f94b1000/0x0/0x4ffc00000, data 0x250bdf8/0x25cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 20004864 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226634800 session 0x5592267a63c0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 19996672 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 19996672 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa16e000/0x0/0x4ffc00000, data 0x112cdf8/0x11ec000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1076098 data_alloc: 234881024 data_used: 10645504
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x55922723c780
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1070046 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1070046 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1070046 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1070046 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1070046 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa582000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 26222592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afa800 session 0x559226ffa960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676ec00 session 0x559226ffa780
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af6400 session 0x559226ffab40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x559226ffb680
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 30.319431305s of 30.379514694s, submitted: 28
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107036672 unmapped: 26198016 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107077632 unmapped: 26157056 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226634800 session 0x559226ffa5a0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676e000 session 0x55922721f4a0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b06c00 session 0x55922721e5a0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e4400 session 0x55922721e960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x559226ffb860
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1117062 data_alloc: 234881024 data_used: 10539008
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3ce000/0x0/0x4ffc00000, data 0x15ede08/0x16ae000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3ce000/0x0/0x4ffc00000, data 0x15ede08/0x16ae000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b01c00 session 0x5592247d9860
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x5592267850e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 26214400 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afc400 session 0x559226784780
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107028480 unmapped: 26206208 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1117924 data_alloc: 234881024 data_used: 10539008
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b06400 session 0x559226784960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3cd000/0x0/0x4ffc00000, data 0x15ede18/0x16af000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107036672 unmapped: 26198016 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107036672 unmapped: 26198016 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3cd000/0x0/0x4ffc00000, data 0x15ede18/0x16af000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3cd000/0x0/0x4ffc00000, data 0x15ede18/0x16af000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1152408 data_alloc: 234881024 data_used: 15626240
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3cd000/0x0/0x4ffc00000, data 0x15ede18/0x16af000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1152408 data_alloc: 234881024 data_used: 15626240
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.764934540s of 19.582212448s, submitted: 45
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa3cd000/0x0/0x4ffc00000, data 0x15ede18/0x16af000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1152956 data_alloc: 234881024 data_used: 15638528
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 25739264 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency slow operation observed for kv_commit, latency = 5.380156040s
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency slow operation observed for kv_sync, latency = 5.380156517s
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency_fn slow operation observed for _txc_committed_kv, latency = 5.380500793s, txc = 0x559226356c00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114221056 unmapped: 19013632 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113123328 unmapped: 20111360 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f963c000/0x0/0x4ffc00000, data 0x237ee18/0x2440000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,11])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114171904 unmapped: 19062784 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114221056 unmapped: 19013632 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f963c000/0x0/0x4ffc00000, data 0x237ee18/0x2440000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253386 data_alloc: 234881024 data_used: 16334848
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f960a000/0x0/0x4ffc00000, data 0x23b0e18/0x2472000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f960a000/0x0/0x4ffc00000, data 0x23b0e18/0x2472000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1263652 data_alloc: 234881024 data_used: 16482304
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 20078592 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 6.865746021s of 15.085161209s, submitted: 113
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112640000 unmapped: 20594688 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112640000 unmapped: 20594688 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112640000 unmapped: 20594688 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9607000/0x0/0x4ffc00000, data 0x23b3e18/0x2475000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112648192 unmapped: 20586496 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261276 data_alloc: 234881024 data_used: 16486400
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 20578304 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 20578304 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 20578304 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 20578304 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x23b4e18/0x2476000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 20578304 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261500 data_alloc: 234881024 data_used: 16486400
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112664576 unmapped: 20570112 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afc000 session 0x559224ef1a40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af9800 session 0x559224ef03c0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b02c00 session 0x55922721f680
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f2400 session 0x55922721f4a0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x23b4e18/0x2476000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112664576 unmapped: 20570112 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.364326477s of 10.769536018s, submitted: 4
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9605000/0x0/0x4ffc00000, data 0x23b4e28/0x2477000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112672768 unmapped: 20561920 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112672768 unmapped: 20561920 heap: 133234688 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af9c00 session 0x559227226960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f2400 session 0x559226ffa1e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af9800 session 0x55922723d680
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112902144 unmapped: 23486464 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1329277 data_alloc: 234881024 data_used: 16490496
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afc000 session 0x5592265523c0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b02c00 session 0x55922669e1e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112574464 unmapped: 23814144 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922483b000 session 0x559226785c20
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112574464 unmapped: 23814144 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8e38000/0x0/0x4ffc00000, data 0x2b81e28/0x2c44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afdc00 session 0x5592247d8d20
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112574464 unmapped: 23814144 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x5592270fc5a0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676e800 session 0x559226fef680
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8e38000/0x0/0x4ffc00000, data 0x2b81e28/0x2c44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 112599040 unmapped: 23789568 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113762304 unmapped: 22626304 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1352731 data_alloc: 234881024 data_used: 19812352
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119595008 unmapped: 16793600 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119595008 unmapped: 16793600 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119595008 unmapped: 16793600 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119595008 unmapped: 16793600 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8e37000/0x0/0x4ffc00000, data 0x2b81e38/0x2c45000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119627776 unmapped: 16760832 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1383891 data_alloc: 234881024 data_used: 24457216
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119627776 unmapped: 16760832 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8e37000/0x0/0x4ffc00000, data 0x2b81e38/0x2c45000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.307135582s of 14.417451859s, submitted: 28
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119652352 unmapped: 16736256 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119652352 unmapped: 16736256 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119652352 unmapped: 16736256 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119652352 unmapped: 16736256 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1384819 data_alloc: 234881024 data_used: 24469504
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8e36000/0x0/0x4ffc00000, data 0x2b81e38/0x2c45000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121552896 unmapped: 14835712 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121905152 unmapped: 14483456 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123551744 unmapped: 12836864 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123551744 unmapped: 12836864 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8723000/0x0/0x4ffc00000, data 0x3295e38/0x3359000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123592704 unmapped: 12795904 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1442267 data_alloc: 234881024 data_used: 24694784
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123592704 unmapped: 12795904 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123592704 unmapped: 12795904 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.862577438s of 11.139899254s, submitted: 73
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123625472 unmapped: 12763136 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8704000/0x0/0x4ffc00000, data 0x32b4e38/0x3378000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123658240 unmapped: 12730368 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123658240 unmapped: 12730368 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1440939 data_alloc: 234881024 data_used: 24694784
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8704000/0x0/0x4ffc00000, data 0x32b4e38/0x3378000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e3800 session 0x5592272270e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b03800 session 0x559226ffab40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 12722176 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8704000/0x0/0x4ffc00000, data 0x32b4e38/0x3378000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,4])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x5592255661e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1273528 data_alloc: 234881024 data_used: 16486400
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x559226f252c0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x5592247d94a0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117874688 unmapped: 18513920 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x23b4e18/0x2476000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b01400 session 0x5592263fe000
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1097099 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.934545517s of 17.322147369s, submitted: 73
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1098611 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1098611 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1097728 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1097728 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1097728 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa8c2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559224eeb000 session 0x559225566960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x55922657e960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x55922657f4a0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559224eeb000 session 0x55922546e960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 23.590827942s of 24.071311951s, submitted: 2
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 23093248 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1099528 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 23085056 heap: 136388608 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9e3a000/0x0/0x4ffc00000, data 0x1b82da6/0x1c42000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,6,11])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 22855680 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9e3a000/0x0/0x4ffc00000, data 0x1b82da6/0x1c42000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,17])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 22855680 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 29122560 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x55922546ef00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e1800 session 0x5592267a72c0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x5592267a6000
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x5592263ffe00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559224eeb000 session 0x5592263feb40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114098176 unmapped: 33841152 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1187056 data_alloc: 234881024 data_used: 10539008
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114098176 unmapped: 33841152 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x5592263fef00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114098176 unmapped: 33841152 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632000 session 0x559224742000
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114098176 unmapped: 33841152 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f99a9000/0x0/0x4ffc00000, data 0x1c03da6/0x1cc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559223a27c00 session 0x559226f25680
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 33513472 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x559226f25e00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 33513472 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9984000/0x0/0x4ffc00000, data 0x1c27dc9/0x1ce8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1233177 data_alloc: 234881024 data_used: 16625664
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 114819072 unmapped: 33120256 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9984000/0x0/0x4ffc00000, data 0x1c27dc9/0x1ce8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268441 data_alloc: 234881024 data_used: 21921792
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9984000/0x0/0x4ffc00000, data 0x1c27dc9/0x1ce8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e1c00 session 0x55922546e5a0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268897 data_alloc: 234881024 data_used: 21934080
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117653504 unmapped: 30285824 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9984000/0x0/0x4ffc00000, data 0x1c27dc9/0x1ce8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.883409500s of 21.952882767s, submitted: 22
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126459904 unmapped: 21479424 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c8d000/0x0/0x4ffc00000, data 0x2916dc9/0x29d7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,5])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 124387328 unmapped: 23552000 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c6a000/0x0/0x4ffc00000, data 0x2939dc9/0x29fa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 124616704 unmapped: 23322624 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1372079 data_alloc: 234881024 data_used: 22802432
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 7795 writes, 32K keys, 7795 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s#012Cumulative WAL: 7795 writes, 1759 syncs, 4.43 writes per sync, written: 0.03 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1988 writes, 7632 keys, 1988 commit groups, 1.0 writes per commit group, ingest: 8.26 MB, 0.01 MB/s#012Interval WAL: 1988 writes, 772 syncs, 2.58 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1372095 data_alloc: 234881024 data_used: 22802432
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 25452544 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 25436160 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 25436160 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 25436160 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 25436160 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1372095 data_alloc: 234881024 data_used: 22802432
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122511360 unmapped: 25427968 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122511360 unmapped: 25427968 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122511360 unmapped: 25427968 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122511360 unmapped: 25427968 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122519552 unmapped: 25419776 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1372247 data_alloc: 234881024 data_used: 22806528
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122519552 unmapped: 25419776 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122519552 unmapped: 25419776 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.538951874s of 21.155471802s, submitted: 103
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122519552 unmapped: 25419776 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122519552 unmapped: 25419776 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122527744 unmapped: 25411584 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1370448 data_alloc: 234881024 data_used: 22806528
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122527744 unmapped: 25411584 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122527744 unmapped: 25411584 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122527744 unmapped: 25411584 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122527744 unmapped: 25411584 heap: 147939328 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,4,0,6])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 141074432 unmapped: 15261696 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1400 session 0x559225017c20
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1500262 data_alloc: 234881024 data_used: 22806528
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122871808 unmapped: 33464320 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122871808 unmapped: 33464320 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f79ef000/0x0/0x4ffc00000, data 0x3bbcdc9/0x3c7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122871808 unmapped: 33464320 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08c00 session 0x559225473c20
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f79ef000/0x0/0x4ffc00000, data 0x3bbcdc9/0x3c7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122871808 unmapped: 33464320 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b02800 session 0x55922721ef00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.648444176s of 12.136721611s, submitted: 17
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b03000 session 0x559226552960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4400 session 0x55922546fc20
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122544128 unmapped: 33792000 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1502946 data_alloc: 234881024 data_used: 22810624
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122544128 unmapped: 33792000 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135053312 unmapped: 21282816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549f800 session 0x559225470b40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f79cb000/0x0/0x4ffc00000, data 0x3be0dc9/0x3ca1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139042816 unmapped: 17293312 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139042816 unmapped: 17293312 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139042816 unmapped: 17293312 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1625513 data_alloc: 251658240 data_used: 41050112
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139075584 unmapped: 17260544 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139149312 unmapped: 17186816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139288576 unmapped: 17047552 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f79cb000/0x0/0x4ffc00000, data 0x3be0dc9/0x3ca1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139337728 unmapped: 16998400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139337728 unmapped: 16998400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f79cb000/0x0/0x4ffc00000, data 0x3be0dc9/0x3ca1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1625426 data_alloc: 251658240 data_used: 41050112
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139337728 unmapped: 16998400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 139460608 unmapped: 16875520 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.569581985s of 12.389707565s, submitted: 232
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142147584 unmapped: 14188544 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f7566000/0x0/0x4ffc00000, data 0x4044dc9/0x4105000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1678578 data_alloc: 251658240 data_used: 42459136
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b05400 session 0x5592250174a0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f7562000/0x0/0x4ffc00000, data 0x4048dc9/0x4109000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142508032 unmapped: 13828096 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142516224 unmapped: 13819904 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1678578 data_alloc: 251658240 data_used: 42459136
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142516224 unmapped: 13819904 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f7562000/0x0/0x4ffc00000, data 0x4048dc9/0x4109000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142524416 unmapped: 13811712 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142524416 unmapped: 13811712 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142524416 unmapped: 13811712 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142532608 unmapped: 13803520 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1678578 data_alloc: 251658240 data_used: 42459136
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142532608 unmapped: 13803520 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f7562000/0x0/0x4ffc00000, data 0x4048dc9/0x4109000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 142532608 unmapped: 13803520 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b09000 session 0x55922669e780
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248ef000 session 0x559225566960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.538358688s of 15.629971504s, submitted: 40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4400 session 0x5592255ff0e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1381072 data_alloc: 234881024 data_used: 22806528
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 27238400 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1382584 data_alloc: 234881024 data_used: 22806528
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8c67000/0x0/0x4ffc00000, data 0x2944dc9/0x2a05000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1382584 data_alloc: 234881024 data_used: 22806528
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afd000 session 0x559226f24960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e8400 session 0x55922669e3c0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 27230208 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.595676422s of 14.635678291s, submitted: 13
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226636c00 session 0x559226553e00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b02400 session 0x55922723d860
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123508 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ecc00 session 0x5592267a63c0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f3400 session 0x55922721e5a0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ec800 session 0x55922721fa40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226728800 session 0x559225016780
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118669312 unmapped: 37666816 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 30.576278687s of 31.081003189s, submitted: 24
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261eb000 session 0x559226ffb860
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x5592263fe1e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1800 session 0x559226f25680
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261edc00 session 0x559226f250e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4400 session 0x559226f25860
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 37027840 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 37027840 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1800 session 0x559225471860
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194678 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119324672 unmapped: 37011456 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261eb000 session 0x559225472b40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261edc00 session 0x559226f252c0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afa800 session 0x559226f241e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119480320 unmapped: 36855808 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9c40000/0x0/0x4ffc00000, data 0x196cdf8/0x1a2c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119496704 unmapped: 36839424 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119496704 unmapped: 36839424 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9c1b000/0x0/0x4ffc00000, data 0x1990e08/0x1a51000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119955456 unmapped: 36380672 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260962 data_alloc: 234881024 data_used: 19304448
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119955456 unmapped: 36380672 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119955456 unmapped: 36380672 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119808000 unmapped: 36528128 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.900426865s of 10.097883224s, submitted: 55
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e1000 session 0x559227226b40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x559224742b40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1800 session 0x55922669e780
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9c1b000/0x0/0x4ffc00000, data 0x1990e08/0x1a51000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1134230 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1134230 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1134230 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676e800 session 0x559226785a40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x5592250174a0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559224eea400 session 0x559225017680
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x559225016000
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.801321030s of 13.943515778s, submitted: 49
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116449280 unmapped: 39886848 heap: 156336128 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b0000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,15])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1800 session 0x559225017c20
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676e800 session 0x5592254705a0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4800 session 0x559225473e00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592248f1c00 session 0x55922721eb40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x55922721f0e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 47742976 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 47742976 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 47742976 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f937a000/0x0/0x4ffc00000, data 0x2232da6/0x22f2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1264351 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 47742976 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e4800 session 0x559224ef01e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 47742976 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f937a000/0x0/0x4ffc00000, data 0x2232da6/0x22f2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b02c00 session 0x5592255661e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117309440 unmapped: 47423488 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117309440 unmapped: 47423488 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9355000/0x0/0x4ffc00000, data 0x2256dc9/0x2317000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 117309440 unmapped: 47423488 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9355000/0x0/0x4ffc00000, data 0x2256dc9/0x2317000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1303380 data_alloc: 234881024 data_used: 15511552
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 43442176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125919232 unmapped: 38813696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9355000/0x0/0x4ffc00000, data 0x2256dc9/0x2317000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125952000 unmapped: 38780928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.294668198s of 11.932563782s, submitted: 37
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549f000 session 0x55922657fe00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676e400 session 0x5592272270e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125952000 unmapped: 38780928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118267904 unmapped: 46465024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x559226552b40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1146182 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 46399488 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: mgrc ms_handle_reset ms_handle_reset con 0x5592249d8000
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/4198923246
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/4198923246,v1:192.168.122.100:6801/4198923246]
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: mgrc handle_mgr_configure stats_period=5
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x559225471a40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1146182 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1146182 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1146182 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1146182 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118439936 unmapped: 46292992 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.419353485s of 23.247339249s, submitted: 36
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147694 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147694 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147694 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226f23000 session 0x559226784f00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118448128 unmapped: 46284800 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559227766000 session 0x559226785c20
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b00800 session 0x5592254725a0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x5592254730e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.210437775s of 15.619210243s, submitted: 1
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x5592263ff2c0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118603776 unmapped: 46129152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118603776 unmapped: 46129152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118603776 unmapped: 46129152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220376 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afb400 session 0x5592270fd860
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118603776 unmapped: 46129152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118603776 unmapped: 46129152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x5592270fc960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d7400 session 0x559225567c20
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118620160 unmapped: 46112768 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118620160 unmapped: 46112768 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x559226ffbc20
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118652928 unmapped: 46080000 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1221149 data_alloc: 234881024 data_used: 10539008
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118652928 unmapped: 46080000 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 118652928 unmapped: 46080000 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279061 data_alloc: 234881024 data_used: 19120128
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b28000/0x0/0x4ffc00000, data 0x1a85d96/0x1b44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279061 data_alloc: 234881024 data_used: 19120128
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119185408 unmapped: 45547520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.080177307s of 20.722246170s, submitted: 23
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 43720704 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9265000/0x0/0x4ffc00000, data 0x2340d96/0x23ff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126017536 unmapped: 38715392 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126550016 unmapped: 38182912 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1390845 data_alloc: 234881024 data_used: 19333120
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126550016 unmapped: 38182912 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8ec0000/0x0/0x4ffc00000, data 0x26e4d96/0x27a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126566400 unmapped: 38166528 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 39575552 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8ec9000/0x0/0x4ffc00000, data 0x26e4d96/0x27a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 39542784 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 39542784 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8ec9000/0x0/0x4ffc00000, data 0x26e4d96/0x27a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1382581 data_alloc: 234881024 data_used: 19349504
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 39542784 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 39542784 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125485056 unmapped: 39247872 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125485056 unmapped: 39247872 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125485056 unmapped: 39247872 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1384645 data_alloc: 234881024 data_used: 19349504
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f8ea1000/0x0/0x4ffc00000, data 0x270cd96/0x27cb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125485056 unmapped: 39247872 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125485056 unmapped: 39247872 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x5592270fc000
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.293542862s of 14.293901443s, submitted: 116
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x559226f250e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08800 session 0x55922677f2c0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162987 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162987 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162987 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162987 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 44580864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d5800 session 0x5592272292c0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592247d4800 session 0x559226fee3c0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x5592254efe00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x559226785c20
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.185562134s of 22.248020172s, submitted: 32
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08800 session 0x5592254725a0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261e6000 session 0x55922723c960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x55922657e1e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x5592270fde00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08800 session 0x559225473680
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 43679744 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209420 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 43679744 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 43679744 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 43679744 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226e3c800 session 0x559225017a40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa07c000/0x0/0x4ffc00000, data 0x1530da6/0x15f0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225656800 session 0x55922669f2c0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 43679744 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x559226f24780
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x559226ffa3c0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 45318144 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216924 data_alloc: 234881024 data_used: 10543104
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119414784 unmapped: 45318144 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa056000/0x0/0x4ffc00000, data 0x1554dd9/0x1616000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1246108 data_alloc: 234881024 data_used: 14667776
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa056000/0x0/0x4ffc00000, data 0x1554dd9/0x1616000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1246108 data_alloc: 234881024 data_used: 14667776
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 45309952 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.542800903s of 17.671800613s, submitted: 41
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 122945536 unmapped: 41787392 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 124780544 unmapped: 39952384 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9869000/0x0/0x4ffc00000, data 0x1d41dd9/0x1e03000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125403136 unmapped: 39329792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f97dc000/0x0/0x4ffc00000, data 0x1dc8dd9/0x1e8a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226864c00 session 0x55922677f860
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226863400 session 0x55922721f0e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922723b000 session 0x559226aeb680
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922723b000 session 0x559225016000
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 124108800 unmapped: 40624128 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549dc00 session 0x55922669e1e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226632400 session 0x55922657f4a0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226863400 session 0x559224649680
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226864c00 session 0x559224ef0f00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225656c00 session 0x55922723cf00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1386172 data_alloc: 234881024 data_used: 16515072
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 39067648 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 39067648 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9129000/0x0/0x4ffc00000, data 0x2477e4b/0x253b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 39034880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 39034880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 39034880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226634400 session 0x55922723da40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1379724 data_alloc: 234881024 data_used: 16515072
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922549c400 session 0x5592267852c0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 39034880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226af4400 session 0x559225625680
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ecc00 session 0x5592270fd860
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 39034880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.054047585s of 10.986348152s, submitted: 167
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f910f000/0x0/0x4ffc00000, data 0x2498e6e/0x255d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126746624 unmapped: 37986304 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 126754816 unmapped: 37978112 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f910f000/0x0/0x4ffc00000, data 0x2498e6e/0x255d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128802816 unmapped: 35930112 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1419509 data_alloc: 234881024 data_used: 22065152
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128811008 unmapped: 35921920 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128811008 unmapped: 35921920 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128811008 unmapped: 35921920 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f910f000/0x0/0x4ffc00000, data 0x2498e6e/0x255d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128843776 unmapped: 35889152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128843776 unmapped: 35889152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1419509 data_alloc: 234881024 data_used: 22065152
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128843776 unmapped: 35889152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128843776 unmapped: 35889152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128843776 unmapped: 35889152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f910f000/0x0/0x4ffc00000, data 0x2498e6e/0x255d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.671233177s of 10.673833847s, submitted: 1
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128876544 unmapped: 35856384 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f910c000/0x0/0x4ffc00000, data 0x249be6e/0x2560000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 133996544 unmapped: 30736384 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1494901 data_alloc: 234881024 data_used: 23621632
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 133668864 unmapped: 31064064 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134979584 unmapped: 29753344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134979584 unmapped: 29753344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134979584 unmapped: 29753344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134979584 unmapped: 29753344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88d4000/0x0/0x4ffc00000, data 0x2cd3e6e/0x2d98000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1505309 data_alloc: 234881024 data_used: 24363008
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134987776 unmapped: 29745152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134987776 unmapped: 29745152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134987776 unmapped: 29745152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88d4000/0x0/0x4ffc00000, data 0x2cd3e6e/0x2d98000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134987776 unmapped: 29745152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88d4000/0x0/0x4ffc00000, data 0x2cd3e6e/0x2d98000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 134987776 unmapped: 29745152 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.098537445s of 12.288821220s, submitted: 94
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1502669 data_alloc: 234881024 data_used: 24371200
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135053312 unmapped: 29679616 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88d2000/0x0/0x4ffc00000, data 0x2cd4e6e/0x2d99000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88d2000/0x0/0x4ffc00000, data 0x2cd4e6e/0x2d99000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1503037 data_alloc: 234881024 data_used: 24436736
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88cd000/0x0/0x4ffc00000, data 0x2cdae6e/0x2d9f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135061504 unmapped: 29671424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ecc00 session 0x55922669f2c0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135069696 unmapped: 29663232 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 135094272 unmapped: 29638656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1348975 data_alloc: 234881024 data_used: 16515072
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 7.806308746s of 10.239793777s, submitted: 37
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f88ce000/0x0/0x4ffc00000, data 0x2cdae5e/0x2d9e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,2])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dfc/0x2009000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dfc/0x2009000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226f23400 session 0x559226784f00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dd9/0x2008000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1347011 data_alloc: 234881024 data_used: 16498688
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dd9/0x2008000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1347011 data_alloc: 234881024 data_used: 16498688
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132030464 unmapped: 32702464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dd9/0x2008000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132038656 unmapped: 32694272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132038656 unmapped: 32694272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132038656 unmapped: 32694272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.272031784s of 13.655331612s, submitted: 34
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b08800 session 0x55922546f0e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132055040 unmapped: 32677888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9663000/0x0/0x4ffc00000, data 0x1f46dd9/0x2008000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1346027 data_alloc: 234881024 data_used: 16498688
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226e3c800 session 0x5592254ef0e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132055040 unmapped: 32677888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132055040 unmapped: 32677888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127320064 unmapped: 37412864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1199806 data_alloc: 234881024 data_used: 10649600
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa48d000/0x0/0x4ffc00000, data 0x111fdb9/0x11df000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226865000 session 0x559223b86960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127336448 unmapped: 37396480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127344640 unmapped: 37388288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127352832 unmapped: 37380096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127361024 unmapped: 37371904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127369216 unmapped: 37363712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127369216 unmapped: 37363712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127369216 unmapped: 37363712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127369216 unmapped: 37363712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192310 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b1000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127369216 unmapped: 37363712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ef000 session 0x559226aea3c0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225657800 session 0x5592263feb40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225657400 session 0x559226fef0e0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226e3ac00 session 0x55922669f680
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 36.472564697s of 42.319786072s, submitted: 56
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127352832 unmapped: 37380096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127352832 unmapped: 37380096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127377408 unmapped: 37355520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127959040 unmapped: 36773888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x55922676f400 session 0x5592247d85a0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225657400 session 0x5592263fe780
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559225657800 session 0x55922721fe00
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156edcf/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1234189 data_alloc: 234881024 data_used: 10539008
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127959040 unmapped: 36773888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127959040 unmapped: 36773888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261ef000 session 0x559224648000
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226e3ac00 session 0x559226fee960
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127959040 unmapped: 36773888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127959040 unmapped: 36773888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127967232 unmapped: 36765696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1234133 data_alloc: 234881024 data_used: 10539008
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127967232 unmapped: 36765696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127967232 unmapped: 36765696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 2.432877302s of 11.771712303s, submitted: 33
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127967232 unmapped: 36765696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b0a000 session 0x559225473c20
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127991808 unmapped: 36741120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127991808 unmapped: 36741120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1235130 data_alloc: 234881024 data_used: 10539008
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127991808 unmapped: 36741120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127967232 unmapped: 36765696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127385600 unmapped: 37347328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127385600 unmapped: 37347328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127385600 unmapped: 37347328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1265986 data_alloc: 234881024 data_used: 15028224
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127385600 unmapped: 37347328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127385600 unmapped: 37347328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1265986 data_alloc: 234881024 data_used: 15028224
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.148657799s of 15.006252289s, submitted: 6
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 127393792 unmapped: 37339136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa03d000/0x0/0x4ffc00000, data 0x156ee08/0x162f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 132644864 unmapped: 32088064 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128950272 unmapped: 35782656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1302734 data_alloc: 234881024 data_used: 15024128
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f9b43000/0x0/0x4ffc00000, data 0x1a68e08/0x1b29000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,2,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128950272 unmapped: 35782656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130285568 unmapped: 34447360 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129679360 unmapped: 35053568 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f997a000/0x0/0x4ffc00000, data 0x1c29e08/0x1cea000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,4])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129687552 unmapped: 35045376 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1328778 data_alloc: 234881024 data_used: 15020032
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f96ff000/0x0/0x4ffc00000, data 0x1eace08/0x1f6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130875392 unmapped: 33857536 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130883584 unmapped: 33849344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130883584 unmapped: 33849344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 1.013013244s of 10.206089973s, submitted: 61
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f96ff000/0x0/0x4ffc00000, data 0x1eace08/0x1f6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130883584 unmapped: 33849344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130883584 unmapped: 33849344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1333106 data_alloc: 234881024 data_used: 15360000
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130883584 unmapped: 33849344 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130916352 unmapped: 33816576 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130916352 unmapped: 33816576 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130916352 unmapped: 33816576 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f96e2000/0x0/0x4ffc00000, data 0x1ec9e08/0x1f8a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130916352 unmapped: 33816576 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1348124 data_alloc: 234881024 data_used: 15777792
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130916352 unmapped: 33816576 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130932736 unmapped: 33800192 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130932736 unmapped: 33800192 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 130932736 unmapped: 33800192 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.314796448s of 11.320782661s, submitted: 31
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x5592261f1400 session 0x55922669fc20
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226f22000 session 0x559226fef4a0
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129081344 unmapped: 35651584 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4f96e2000/0x0/0x4ffc00000, data 0x1ec9e08/0x1f8a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1207674 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129089536 unmapped: 35643392 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226afa000 session 0x559226aeb680
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129122304 unmapped: 35610624 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129130496 unmapped: 35602432 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129138688 unmapped: 35594240 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129146880 unmapped: 35586048 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129146880 unmapped: 35586048 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129146880 unmapped: 35586048 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129146880 unmapped: 35586048 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129146880 unmapped: 35586048 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129163264 unmapped: 35569664 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129163264 unmapped: 35569664 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129163264 unmapped: 35569664 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129179648 unmapped: 35553280 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129179648 unmapped: 35553280 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129187840 unmapped: 35545088 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129228800 unmapped: 35504128 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'config diff' '{prefix=config diff}'
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'config show' '{prefix=config show}'
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'counter dump' '{prefix=counter dump}'
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'counter schema' '{prefix=counter schema}'
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128770048 unmapped: 35962880 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128917504 unmapped: 35815424 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'log dump' '{prefix=log dump}'
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'perf dump' '{prefix=perf dump}'
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'perf schema' '{prefix=perf schema}'
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128909312 unmapped: 35823616 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128950272 unmapped: 35782656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128950272 unmapped: 35782656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128950272 unmapped: 35782656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128950272 unmapped: 35782656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128950272 unmapped: 35782656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 234881024 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128950272 unmapped: 35782656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128950272 unmapped: 35782656 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128958464 unmapped: 35774464 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128966656 unmapped: 35766272 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128974848 unmapped: 35758080 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128983040 unmapped: 35749888 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128991232 unmapped: 35741696 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 128999424 unmapped: 35733504 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: [db/db_impl/db_impl.cc:1111]
                                              ** DB Stats **
                                              Uptime(secs): 2400.1 total, 600.0 interval
                                              Cumulative writes: 9973 writes, 39K keys, 9973 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s
                                              Cumulative WAL: 9973 writes, 2660 syncs, 3.75 writes per sync, written: 0.03 GB, 0.01 MB/s
                                              Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                              Interval writes: 2178 writes, 7712 keys, 2178 commit groups, 1.0 writes per commit group, ingest: 7.91 MB, 0.01 MB/s
                                              Interval WAL: 2178 writes, 901 syncs, 2.42 writes per sync, written: 0.01 GB, 0.01 MB/s
                                              Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129007616 unmapped: 35725312 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129015808 unmapped: 35717120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129015808 unmapped: 35717120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129015808 unmapped: 35717120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129015808 unmapped: 35717120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129015808 unmapped: 35717120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129015808 unmapped: 35717120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129015808 unmapped: 35717120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129015808 unmapped: 35717120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129015808 unmapped: 35717120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129015808 unmapped: 35717120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129015808 unmapped: 35717120 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129024000 unmapped: 35708928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129024000 unmapped: 35708928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129024000 unmapped: 35708928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129024000 unmapped: 35708928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129024000 unmapped: 35708928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129024000 unmapped: 35708928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129024000 unmapped: 35708928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129024000 unmapped: 35708928 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129032192 unmapped: 35700736 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129032192 unmapped: 35700736 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 232.847518921s of 234.084274292s, submitted: 40
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129032192 unmapped: 35700736 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,1,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129056768 unmapped: 35676160 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129064960 unmapped: 35667968 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129064960 unmapped: 35667968 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129073152 unmapped: 35659776 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129073152 unmapped: 35659776 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129097728 unmapped: 35635200 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129155072 unmapped: 35577856 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129171456 unmapped: 35561472 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa4b2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,1])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129253376 unmapped: 35479552 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 3.775661469s of 10.127549171s, submitted: 153
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129286144 unmapped: 35446784 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,2])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129294336 unmapped: 35438592 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129351680 unmapped: 35381248 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129351680 unmapped: 35381248 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 35364864 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 35356672 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 35348480 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 35340288 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 35274752 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 35274752 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 35274752 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 35274752 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 35274752 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 35274752 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 35274752 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 35274752 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 35274752 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 35274752 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 35266560 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 35258368 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:58.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 35250176 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 35241984 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 ms_handle_reset con 0x559226b06000 session 0x5592255fed20
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 35233792 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 35225600 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 35225600 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 35225600 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 35225600 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 35225600 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 35225600 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 35225600 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 35225600 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 35217408 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 35209216 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 35209216 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 35209216 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 35209216 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 35209216 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 35209216 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 35209216 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 35209216 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 35209216 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 35209216 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 35209216 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 35201024 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 35192832 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 35192832 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 35192832 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 35192832 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 35192832 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 35192832 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 35192832 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 35192832 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 35192832 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 35192832 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 35192832 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 35184640 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129556480 unmapped: 35176448 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129556480 unmapped: 35176448 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129556480 unmapped: 35176448 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129556480 unmapped: 35176448 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129556480 unmapped: 35176448 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129556480 unmapped: 35176448 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129556480 unmapped: 35176448 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129556480 unmapped: 35176448 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129556480 unmapped: 35176448 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129564672 unmapped: 35168256 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 35332096 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 35323904 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 35315712 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 35307520 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:58 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 35299328 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 35291136 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1206170 data_alloc: 218103808 data_used: 10534912
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'config diff' '{prefix=config diff}'
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 35282944 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'config show' '{prefix=config show}'
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'counter dump' '{prefix=counter dump}'
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'counter schema' '{prefix=counter schema}'
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129204224 unmapped: 35528704 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: prioritycache tune_memory target: 4294967296 mapped: 129359872 unmapped: 35373056 heap: 164732928 old mem: 2845415832 new mem: 2845415832
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fa0a2000/0x0/0x4ffc00000, data 0x10fbd96/0x11ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Jan 23 05:40:58 np0005593295 ceph-osd[81231]: do_command 'log dump' '{prefix=log dump}'
Jan 23 05:40:58 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:40:58 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:40:58 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:58.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:40:58 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 23 05:40:58 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3124684249' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 05:40:58 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 23 05:40:58 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/666089249' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 05:40:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:40:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:59 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:40:59 2026: (VI_0) received an invalid passwd!
Jan 23 05:40:59 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 23 05:40:59 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1596010367' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 05:40:59 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 23 05:40:59 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/239682889' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 05:41:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:41:00 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 23 05:41:00 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1745865103' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 05:41:00 np0005593295 nova_compute[225701]: 2026-01-23 10:41:00.323 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:41:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:00.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:00 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:00 2026: (VI_0) received an invalid passwd!
Jan 23 05:41:00 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:41:00 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:00 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:00.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:00 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Jan 23 05:41:00 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/875828868' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 05:41:00 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Jan 23 05:41:00 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2414639810' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 05:41:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:41:01 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:01 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:01 2026: (VI_0) received an invalid passwd!
Jan 23 05:41:01 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Jan 23 05:41:01 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1637687014' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 05:41:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:41:02 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Jan 23 05:41:02 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/233774706' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 05:41:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:41:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:02.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:02 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:02 2026: (VI_0) received an invalid passwd!
Jan 23 05:41:02 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Jan 23 05:41:02 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3108936715' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 05:41:02 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:41:02 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:02 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:02.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:02 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Jan 23 05:41:02 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1660365616' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 05:41:02 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Jan 23 05:41:02 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3345460299' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 05:41:02 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Jan 23 05:41:02 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1102334395' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 05:41:02 np0005593295 nova_compute[225701]: 2026-01-23 10:41:02.924 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:41:03 np0005593295 systemd[1]: Starting Hostname Service...
Jan 23 05:41:03 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Jan 23 05:41:03 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2385619581' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 05:41:03 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Jan 23 05:41:03 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/369562290' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 05:41:03 np0005593295 systemd[1]: Started Hostname Service.
Jan 23 05:41:03 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:03 2026: (VI_0) received an invalid passwd!
Jan 23 05:41:03 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Jan 23 05:41:03 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3665764273' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 05:41:03 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Jan 23 05:41:03 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4166326524' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 23 05:41:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:41:04 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Jan 23 05:41:04 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2110475920' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 23 05:41:04 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Jan 23 05:41:04 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3690345591' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 05:41:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:41:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:04.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:04 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:04 2026: (VI_0) received an invalid passwd!
Jan 23 05:41:04 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:41:04 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 23 05:41:04 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:04.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 23 05:41:04 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Jan 23 05:41:04 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/924431456' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 05:41:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:41:05 np0005593295 nova_compute[225701]: 2026-01-23 10:41:05.325 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:05 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:05 2026: (VI_0) received an invalid passwd!
Jan 23 05:41:05 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Jan 23 05:41:05 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/631640421' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 05:41:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:41:06 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:06 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Jan 23 05:41:06 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1856915728' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 05:41:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:41:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:06.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:06 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:06 2026: (VI_0) received an invalid passwd!
Jan 23 05:41:06 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:41:06 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:06 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:06.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:06 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Jan 23 05:41:06 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/444520388' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 23 05:41:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:41:07 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:07 2026: (VI_0) received an invalid passwd!
Jan 23 05:41:07 np0005593295 nova_compute[225701]: 2026-01-23 10:41:07.926 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:41:08 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Jan 23 05:41:08 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2866924100' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 23 05:41:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:41:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:08.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:08 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:08 2026: (VI_0) received an invalid passwd!
Jan 23 05:41:08 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:41:08 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 23 05:41:08 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:08.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 23 05:41:08 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Jan 23 05:41:08 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/704825705' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 05:41:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:41:09 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 05:41:09 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 05:41:09 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Jan 23 05:41:09 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2035341214' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 05:41:09 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 05:41:09 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 05:41:09 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:09 2026: (VI_0) received an invalid passwd!
Jan 23 05:41:09 np0005593295 ceph-mon[75771]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Jan 23 05:41:09 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1386926707' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 23 05:41:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-rgw-default-compute-2-qpmsjd[85164]: Fri Jan 23 10:41:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:41:10 np0005593295 nova_compute[225701]: 2026-01-23 10:41:10.327 225706 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:10 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 05:41:10 np0005593295 ceph-mon[75771]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 05:41:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:41:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:10.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:10 np0005593295 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-2-pawaai[84154]: Fri Jan 23 10:41:10 2026: (VI_0) received an invalid passwd!
Jan 23 05:41:10 np0005593295 radosgw[82185]: ====== starting new request req=0x7f821c23b5d0 =====
Jan 23 05:41:10 np0005593295 radosgw[82185]: ====== req done req=0x7f821c23b5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:10 np0005593295 radosgw[82185]: beast: 0x7f821c23b5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:10.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
